Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • Thanks Pieter - that is a correct analysis and pretty much the same as my own thoughts... but I was interested to see if there would be any benefits. I had a feeling it would be hard work to get that domain ranking.

    | agua
    0

  • That's right. A soft 404 is still a missing document, but it allows the user to continue through the pages without leaving the website. Tom

    | tomhall90
    0

  • I figured it out. I put a "/" at the end of my 301 redirect target and it worked. The other one didn't have a "/" at the end of the URL, so it wouldn't forward correctly. Hope this solves someone else's problem too!

    | timeintopixels
    0
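The trailing-slash fix above can be sketched in Apache `.htaccess` terms; this is an assumption, since the original post doesn't say which server or redirect method was used, and the paths are placeholders:

```apache
# Works: the redirect target ends with a trailing slash,
# matching the canonical form of the destination URL.
Redirect 301 /old-page https://www.example.com/new-page/

# The problem case described: the same rule without the
# trailing "/" on the target, which failed to forward as expected.
Redirect 301 /other-page https://www.example.com/other-page
```

Whether the slash matters depends on how the destination server canonicalizes the URL, so it's worth testing both forms with a redirect checker.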

  • Hi Ryan, you've received some great responses here. Did they answer your question?

    | Christy-Correll
    0

  • Thanks Doc! Always good to be reminded that Google often changes its mind on how it handles these things. I think I am going to try it. If things don't work or if things change, I can yank it off of the secondary site.

    | EGOL
    1

  • Thank you all for the detailed responses. I am going to use the suggested tools and get to the bottom of this problem. I will report back when I have some more insight.

    | GladdySEO
    0

  • I think this issue might happen sometimes because the system has refreshed the data. Try checking it again the next day.

    | Rephael
    0

  • It does take some time for Google to remove pages from the index. You can speed it along a little by using the "Remove URL" link under Optimization in Google's Webmaster Tools. You should also disallow the URL in your robots.txt file, to be on the safe side, if you don't want the page to show up in searches. And there is also Google's page about removal: https://support.google.com/webmasters/bin/answer.py?hl=en&answer=59819

    | Nick_Ker
    0
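As a sketch, the robots.txt rule mentioned above would look something like this; the path is a placeholder, since the thread doesn't name the actual page:

```text
# Block all crawlers from the page being removed from the index.
# /removed-page/ is a hypothetical path, not from the original post.
User-agent: *
Disallow: /removed-page/
```

Note that a Disallow rule stops crawling of the URL; pairing it with the "Remove URL" request, as the answer suggests, is what keeps it out of search results.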

  • The page in question has a bunch of links that are really hyper-targeted. http://www.opensiteexplorer.org/anchors?site=www.commercialmatsandrubber.com%2FAnti-Fatigue-Mats-c5.html I wouldn't worry too much about internal links unless you really go overboard with them. It's very possible that these links are being devalued. It's also possible that the external links with anchor text are working against you. Disavow was probably the right move, but now you need legitimate external links.

    Without knowing more re: the questions I asked, I still don't know whether you're being impacted by Panda, Penguin, or both. We haven't talked a lot about Panda in this thread, but you need to ask yourself if you are the best result for users, and whether Google's site quality judgments might be working against you. When people look for "anti-fatigue mats", do they want to see a bunch of text, or do they want to see products? How do the user metrics look for generic keywords? The https warning (some unsecure elements) may be putting some people off, too - it's worth testing.

    Ultimately the answer is probably going to be the same: you need a new design, and you need legitimate popularity metrics to replace the artificial popularity metrics that worked in the past. Hope this helps.

    | Carson-Ward
    0

  • I would hire someone who really knows their stuff about Penguin and Panda... and has lots of experience getting penalized sites restored in the SERPs. That person might know what can be done about your situation. It might not cost as much as you think.

    | EGOL
    0

  • The problem with that is maintaining and scaling this structure, now and in the future. From Google's side, it's OK to use any of the versions: if you search for "Voitures à vendre à Paris" ("cars for sale in Paris"), you'll sometimes see the words "cars" or even "autos" highlighted in the URL. So Google does a pretty good job of detecting translated, or even transliterated, versions. Cheers

    | wissamdandan
    0

  • I can confirm Chris is correct; our crappy CMS does the same thing (for certain pages), and those pages are not indexed.

    | PaddyDisplays
    0

  • OK, thanks Chris, but it's not something to do with Yoast (the platform is WordPress)? The instruction would just be that we're getting warning signs in the rich snippet testing tool and it needs to be fixed! Also, just to confirm, this is nothing to do with authorship (since that tested fine) and it's just other structured data detected on the page? Cheers, Dan

    | Dan-Lawrence
    0

  • That page seems fine, and it's also indexed by Google, so I'm not sure what the story is with that. Might be best to contact SEOmoz support (help@moz.com, I think).

    | PaddyDisplays
    0

  • @Don - I mentioned that you didn't say how many links you had, but you did say you had a problem of too many on-page links. There wasn't much detail about the links, hence the question about where you were getting the data from, and Andrew's question too. Perhaps I should have left off "the numbers you mention aren't that high" - it was a reference to the number of websites you mentioned and the thought that there wouldn't be that many links from just one page to your other sites.

    | Houses
    0

  • Thanks Chris. We have several affiliates that use banners stored on our servers. I have been told in the past that you cannot nofollow an image banner like 470x80.gif, as it is not an <a href=""> link - is that not the case?

    | MotoringSEO
    1
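For context on the question above: the image file itself carries no link attributes; the nofollow goes on the anchor tag that wraps it. A minimal sketch, with placeholder URLs and the banner filename taken from the post:

```html
<!-- The <img> alone passes nothing; only the wrapping <a> is a link,
     so rel="nofollow" on that anchor covers the banner. -->
<a href="https://www.example.com/" rel="nofollow">
  <img src="https://www.example.com/banners/470x80.gif"
       alt="Affiliate banner" width="470" height="80">
</a>
```

If affiliates embed the image without a link at all, there is nothing to nofollow; only hotlinked images wrapped in anchors on their sites would need it.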

  • Hi Everett, Thanks for the clear and concise information, and for the additional helpful tips! The text to write on the page is interesting, although I might not use it since I am not sure how helpful it is for UX, as it pushes the things they were looking for (images of jewelry for sale) further down, and possibly below the fold.

    The internal links for the gold category were changed quite a while ago, since we found it more helpful for users to have two separate categories. External links were seen from the MozBar.

    The problem is that I had a similar problem with another KW which was in the top 10 and then fell out, and now the ~ #11 result is a completely different page (a product page, not a category), while the original category page that was showing in the SERP is MIA. I don't think it was over-optimized, since it fell out before many of the optimizations were done, and there were no significant changes made to the original page. Any other ideas on how to solve it, or how to best prevent it (as much as a webmaster can) in the future?

    | Don34
    0

  • This is a short response, but you should make sure that you look at where you are in the SERPs:

    1. Logged into Google
    2. Not logged into Google (that's key!)
    3. Using different countries' search engines, since your rank will probably vary between the US and CA.

    | seoagnostic
    0

  • In my experience it can take a while to see sitemaps updated, particularly if you are a small site and are updating it daily. Unless your site really is changing every day, I would submit it less often and/or just be patient. You didn't indicate how long you've been waiting, but expect at least a few weeks to a month or two if your site is new or "small". If there is a problem, your Webmaster Tools will let you know.

    I agree with Crusader as well. Sections should be broadly included in the sitemap...specific individual pages aren't really a concern with this (assuming you link well to them). You are only meant to provide the structure of the site to Google (main sections, sub-sections, subdomains, etc.) and let them come index it when they are ready. But I do think you probably need to move on to other efforts and be patient for Google to get around to visiting your site.

    | Ikusa
    0
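The structure described above can be sketched as a minimal XML sitemap; the URLs and change frequencies are placeholders, not taken from the thread:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- List the main sections and sub-sections; Google crawls
       individual pages on its own schedule. -->
  <url>
    <loc>https://www.example.com/</loc>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>https://www.example.com/section/</loc>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```

A sitemap like this is a hint, not a command: `changefreq` is advisory, and Google decides when to recrawl regardless of how often the file is resubmitted.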

  • Do you have/have you submitted any video sitemaps?

    | DougRoberts
    0