Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • Thanks Rikki, I did find the canonical tag and thought myself that it would have an impact. It's always good to get others' opinions, though. Thanks, Robert

    | Trespass
    0

  • Hi Lynn, Thank you so much for your answer :). The reason for moving domain1 to domain2 is a new business plan (what we earn money on and what signals we want to send). Knowing the risks, I can better guide my employer on what options we have and what the outcomes might be. So your answer helped a lot. Thank you :). Maja

    | Bulpen
    0

  • On your beta sites in the future, I would recommend using Basic HTTP Authentication so that spiders can't access them at all. For Apache:

        AuthUserFile /var/www/sites/passwdfile
        AuthName "Beta Realm"
        AuthType Basic
        require valid-user

    Then create the password file:

        htpasswd -m /var/www/sites/passwdfile username

    If you do this, Google's Removal Tool will also conclude "OK, it's not there, I should remove the page," because they usually check for the content on the page before processing a removal. If you don't remove the text, they MAY not process the removal request (even if the page has noindex, though I don't know whether that's the case).
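    If you happen to be on Apache 2.4, the same idea can be combined with an IP allowance so that your own office skips the password prompt while everyone else (including crawlers) is blocked. A sketch only; the path, realm name, and IP address are placeholders:

    ```apache
    # Block the beta site for everyone except valid logins or the office IP.
    # Create the user file first: htpasswd -cm /var/www/sites/passwdfile username
    AuthType Basic
    AuthName "Beta Realm"
    AuthUserFile /var/www/sites/passwdfile
    <RequireAny>
        Require valid-user
        Require ip 203.0.113.10
    </RequireAny>
    ```

    For this to work in a .htaccess file, the enclosing vhost needs AllowOverride AuthConfig.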

    | Vuly
    0

  • It looks like most of the content on your site is duplicate content. That's your biggest problem. Google doesn't like duplicate content, so if you want to rank, you're going to have to create content that is unique to your site. Reducing the number of links you have on the page can help, and will probably be a better user experience. No use overwhelming the user or confusing the search spiders with tons of links. Finally, do not nofollow your internal pages. This will not preserve page rank, it only results in link juice evaporation.

    | TakeshiYoung
    0

  • Hello Mathew, I did a site:domain.com search and do still see some of the old URLs indexed, so I checked the URLs using an HTTP header status code checker and they are returning the correct 301 response. I also checked the rel canonical tag on the new URLs, and they reference themselves, not the old URLs. Therefore I see no reason to be concerned about this issue. It takes time for Google to revisit those old URLs, see the redirect, and update their index. In time the old URLs will drop off, and any links going into them should begin counting toward the PageRank of your new URLs. However... You have dozens of geotargeted doorway pages that Google probably doesn't like, or that at least violate their guidelines. If there were an office in each location it would be the right thing to do, as you would include the geo-specific address and phone number. Since every page has the same phone number and presumably there is only one office, you are running into the same problem many other "local" businesses have had to deal with over the years. Unfortunately, there still isn't any real solution, and you will have real trouble ranking in the local/maps area on Google. What to do about this is beyond the scope of this question, but if you're going to work with another SEO on this, I'd recommend one who has experience with service-oriented businesses with multiple locations. This page would be a good place to start; I have pre-filtered it to show only "local search" experts. Good luck!
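    If you'd rather script the status-code check than rely on a web tool, a minimal sketch using only the Python standard library; it reports the status of the first response without following redirects, so a 301 shows up as 301 (the example URL in the comment is hypothetical):

    ```python
    import http.client
    from urllib.parse import urlparse

    def first_status(url: str) -> int:
        """Return the status code of the initial response, without following redirects."""
        parts = urlparse(url)
        conn_cls = (http.client.HTTPSConnection if parts.scheme == "https"
                    else http.client.HTTPConnection)
        conn = conn_cls(parts.netloc)
        try:
            # HEAD is enough: we only care about the status line and headers.
            conn.request("HEAD", parts.path or "/")
            return conn.getresponse().status
        finally:
            conn.close()

    # Hypothetical old URL that should 301 to its replacement:
    # first_status("http://example.com/old-page")
    ```

    Browser dev tools and some online checkers silently follow redirects, which is exactly the discrepancy this avoids.
    
    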

    | Everett
    0

  • All my clients are impatient with Google's crawl.  I think the speed of life on the web has spoiled them.  Assuming your site isn't a huge e-commerce or subject-matter site, you will get crawled, just not right away.  Smaller, newer sites take time. Take any concern and put it toward link building to the new site so Google's crawlers find it faster (via their seed list).  Get it up on DMOZ, get that Twitter account going, post videos to YouTube, etc.  Get some juicy high-PR inbound links and that could help speed up the indexing.  Good luck!

    | Ikusa
    0

  • OK never mind, I found a redirect checker and it says 301 so I guess that's good.  I still don't really get why it's useful to anyone to have this link in the comments area.

    | jeff_amm
    0

  • As Frankenstein always says, "Thin, duplicate content bad!" As Matt Cutts says, "Comprehensive, unique content good!" You need to configure the catalog with unique titles and descriptions for each product page and make sure that each product page is information rich: great descriptions, pictures, reviews, etc.  You will need to spend some time on this.  Possibly pay an intern or someone on oDesk to work on it for you.  Also, the navigation through the product pages in the catalog needs to be clear: a single path through it for bots, so that they do not get caught in an infinite loop of search results and filters leading to thin search-result pages.  If you have any search/sort/filter options for search results, you need to nofollow links to those pages and noindex the result pages. If you already have an online catalog and this new one duplicates the old one, you need to consolidate the two and get as much unique, helpful, and easily accessible content as you can on the site. Watch this video on pagination; it can lead you down a good path on how to set this up: http://googlewebmastercentral.blogspot.com/2012/03/video-about-pagination-with-relnext-and.html Cheers
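    In markup, the noindex/nofollow setup described above comes down to a few lines. A sketch only; the URLs, parameter names, and page numbers are placeholders:

    ```html
    <!-- On a filtered/sorted search-result page you don't want indexed: -->
    <meta name="robots" content="noindex, follow">

    <!-- On links pointing INTO filter/sort variants: -->
    <a href="/catalog?sort=price" rel="nofollow">Sort by price</a>

    <!-- On page 2 of a paginated catalog, in the <head>: -->
    <link rel="prev" href="http://www.example.com/catalog?page=1">
    <link rel="next" href="http://www.example.com/catalog?page=3">
    ```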

    | CleverPhD
    0

  • Hi guys, I see this as well for a client we have in a competitive space. It's a little difficult to understand, because it tends to apply only to particular pages of a domain (please check by using site:yourdomain.com and viewing all the titles); there seems to be no correlation at all. It could be a conversion / click-through test by Google.

    | toddmumford
    0

  • Hi, thank you for your response. I think that was nearly what I needed. I used this one here:

        RewriteEngine On
        RewriteCond %{HTTP_HOST} !^(www\.)?lillymeetslola\.com$ [NC]
        RewriteRule ^(.*) http://www.lillymeetslola.com/$1?domain=%{HTTP_HOST} [R=301,L]

    Now every request on every domain except the main domain should be redirected. Correct? Can you check with one of my alternatives whether this works properly? http://internationalmakeupschools.com/ Thanks
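    If it helps to sanity-check the condition before deploying, the host test in that RewriteCond can be simulated with a quick sketch (Apache's actual behavior may still differ around edge cases such as ports, so this only checks the pattern logic):

    ```python
    import re

    # Same pattern as the RewriteCond, with the dots escaped;
    # Apache's [NC] flag corresponds to re.IGNORECASE here.
    MAIN_DOMAIN = re.compile(r"^(www\.)?lillymeetslola\.com$", re.IGNORECASE)

    def should_redirect(host: str) -> bool:
        """True when the RewriteCond matches, i.e. the request would be 301'd."""
        return MAIN_DOMAIN.match(host) is None

    for host in ["lillymeetslola.com", "www.lillymeetslola.com",
                 "internationalmakeupschools.com", "www.internationalmakeupschools.com"]:
        print(host, should_redirect(host))
    ```

    Only the two canonical hosts should come back False; every alternative domain should come back True.
    
    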

    | KillAccountPlease
    0

  • Thanks for your suggestions. Unfortunately their budget is limited so I don't think it will be possible to hire a writer. I will definitely take a look at de-indexing in the meantime as they work through the content.

    | bobbygsy
    0

  • Hmm, this might actually be a non-issue. I used http://tools.seobook.com/server-header-checker/, but if I check it with Firebug it correctly returns a 404...

    | zeepartner
    0

  • Hi Project Labs, To better understand the scenario, we would want to know whether they received a link warning in Google Webmaster Tools, or just "assessed" themselves as penalized because of steep ranking drops. The easiest way to answer is to comment on your points:

    "Over 2,500 linking domains." It is very difficult for even a medium-sized business to earn 2,500 unique linking domains, so this number alone might have raised a flag with Google, especially if it deviated from the aggregate "norm" for this niche / keyword set.

    "Dozens of high quality linking domains like Huffington Post and Mashable." This helps authority, but can quickly be trumped by negative factors.

    "Some off topic guest post links, e.g. on an SEO site." I would recommend they contact those webmasters and have these links removed.

    "Guest post anchor text was usually their site name, which is an exact match domain." EMD is a more difficult issue now, since exact match domains no longer rank as well by default. Because people like to link to a site by name, the exact match can sometimes be a serious problem. Is this website owned by a business that has a unique name? Example: Cheesepizza.com owned by Cheesy Pizza Dynasty. If so, the owner should switch to using the unique brand name. In some cases there is value in creating a unique name and building business brand signals if they do not currently have them.

    "Lots of top 100 resource pages that received good organic links." Excellent. They would want to check for relevancy and co-citation (shared links to other websites and their main topics).

    "Infographics with links using their domain name as the anchor text." I would remove those EMD anchor texts and move to URL-only anchors, or "Alloneword.com", or the second brand / business option.

    "Relatively few spammy links according to Open Site Explorer. Overall their site's links were engineered, but using tactics that most would consider 'white hat.' I don't think they violated any Google Webmaster Guidelines. Why were they penalized?" Anything engineered will be, or has been, penalized or demoted, and I wouldn't expect this to change. Internal links and other areas of SEO / trust signals aside, I would recommend studying the top 20 in this niche and thoroughly analyzing their backlinks: 1. Rate of link building. 2. Exact match vs. commercial anchors vs. brand vs. branded vs. other. 3. Percentage of "engineered" links vs. natural, editorial links. 4. The union of links that competitors in the top 5 positions in Google share. Hope this helps!
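    A rough way to get a first read on the anchor-text mix (point 2 of the checklist above) is to bucket anchors into brand / exact-match / URL / other and look at the percentages. A minimal sketch only: the brand, keyword, domain, and sample anchors are all hypothetical, and a real audit would run this over an export from Open Site Explorer or Webmaster Tools:

    ```python
    # Hypothetical link profile for the Cheesepizza.com example above.
    BRAND = "cheesy pizza dynasty"   # the business's unique brand name
    EXACT_MATCH = "cheese pizza"     # the money keyword / EMD phrase
    DOMAIN = "cheesepizza.com"

    def classify_anchor(anchor: str) -> str:
        """Put one anchor text into a rough bucket."""
        a = anchor.strip().lower()
        if DOMAIN in a or a.startswith("http"):
            return "url"
        if BRAND in a:
            return "brand"
        if EXACT_MATCH in a:
            return "exact_match"
        return "other"

    def anchor_distribution(anchors):
        """Percentage of anchors falling into each bucket."""
        counts = {}
        for anchor in anchors:
            bucket = classify_anchor(anchor)
            counts[bucket] = counts.get(bucket, 0) + 1
        return {b: round(100 * n / len(anchors), 1) for b, n in counts.items()}

    sample = [
        "cheese pizza delivery",     # exact-match anchor (risky if dominant)
        "Cheesy Pizza Dynasty",      # brand anchor (natural-looking)
        "http://cheesepizza.com/",   # bare URL anchor
        "click here",                # generic anchor
    ]
    print(anchor_distribution(sample))
    ```

    A profile dominated by the exact-match bucket is the kind of "engineered" signature discussed above; natural profiles skew toward brand, URL, and generic anchors.
    
    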

    | toddmumford
    0

  • Actually, it is well documented that it does pass link juice, similarly to a 301 redirect. You can't 301 redirect Blogspot blogs because you don't have access to the server files.

    | ProjectLabs
    0

  • Tables render slightly slower than divs, but it's not the end of the world. In general, you should avoid tables unless you're working with actual tabular data, but if tables make your life easier they're not going to have much impact on SEO, if any.

    | TakeshiYoung
    0

  • Hello Miguel, As long as you don't stuff these keywords into your page too much, it's okay. Basically, since the Penguin 2.0 update, pages stuffed with a high keyword density no longer get priority. You have to focus on quality, unique content instead. Hope it helps!
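    Keyword density is easy enough to estimate yourself. A rough sketch (the tokenization is deliberately naive and the sample text is made up; it measures the share of words that belong to occurrences of the phrase):

    ```python
    import re

    def keyword_density(text: str, keyword: str) -> float:
        """Percentage of words in `text` covered by occurrences of `keyword`."""
        words = re.findall(r"[a-z0-9']+", text.lower())
        kw = keyword.lower().split()
        if not words or not kw:
            return 0.0
        # Slide a window of len(kw) over the word list and count phrase matches.
        hits = sum(
            1 for i in range(len(words) - len(kw) + 1)
            if words[i:i + len(kw)] == kw
        )
        return 100.0 * hits * len(kw) / len(words)

    text = "Cheap coffee beans: buy cheap coffee beans and more coffee beans online."
    print(round(keyword_density(text, "coffee beans"), 1))
    ```

    There is no published "safe" threshold; the point of a check like this is just to spot pages where one phrase clearly dominates the copy.
    
    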

    | Rephael
    0

  • I agree with Takeshi. As long as you write for authoritative sites, you can only increase your own authority. If and when Google decides to launch Author/Agent Rank, you will already have written a lot of good content on a lot of different authoritative websites. This is definitely a good thing. Don't worry about the long contributor list. Would you think any less of an author who has written dozens of books?

    | WesleySmits
    1

  • Hey Segafredo, Don't you think these URLs are a little bit long? There must be a better way to organise your URL structure: www.domain.com/shop/coffee/beans/brand, for example. You could leave out the second 'coffee-beans', and the 500gr could be an attribute of the product, so it wouldn't appear in the URL. That would make your URLs easier to remember, make them look less 'spammy', and just a lot cleaner. You would still rank for 'coffee', 'beans', and 'coffee beans'. Your name is Segafredo, so if you really are from the major company Segafredo, then short URLs would be a lot better: a big brand like that attracts visitors and links by itself, and messy URLs would do your reputation no good. Hope I was of assistance. If you have any other questions, let me know.

    | WesleySmits
    0

  • Write unique and deep content about the subject. For example, if you are targeting the long-tail keywords 'cool buildings found in new york city' and 'the best subway stations in the world', you should write really good content about those two subjects. Try to capture multiple long-tail keywords around a given topic, and make sure the content is valuable to your visitors, not just created purely to rank for certain keywords.

    | WesleySmits
    0