Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Howdy! Yes, I believe we got this sorted out. Interestingly, it wasn't any of the suggestions made here causing the 301 status code responses. I posted a thread in the Google Webmaster Tools forum about the issue and received a response that I am 99.5% sure is the correct answer. Here is a link to that thread for future readers' reference: https://productforums.google.com/forum/#!mydiscussions/webmasters/zOCDAVudxNo I believe the underlying issue is incorrect handling of a redirect for this domain: ccisound.com. I am currently pursuing a correction with our IT Director. Once the remedy is in place, I should know right away whether it solves the issue I am seeing in the server logs. I'll post back here once I am 100% certain that was the issue. Thanks, all! This has been an interesting one for me!

    | danatanseo
    0

  • It's going to be hard to give an exact answer as the sites have now changed.  But, if your rankings dropped a month ago I'd be looking at Panda as a culprit.  There was a Panda update that officially happened May 20 but many people saw changes a few days before that.

    | MarieHaynes
    0

  • Providing an alternative way for search engines to access content that is not otherwise available to them, but is clearly available to anyone who "sees" the page, is a legitimate use of the display: none CSS property. Here is what Google has to say about it: http://www.seroundtable.com/google-hiding-content-17136.html
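    A minimal sketch of the kind of legitimate use described above (element names and text are hypothetical): the content is present in the HTML for every visitor and crawler, and display: none merely collapses it until the visitor asks for it.

    ```html
    <!-- Content hidden by default but available to anyone who interacts with
         the page; crawlers see the same text in the HTML source. -->
    <button onclick="document.getElementById('full-specs').style.display = 'block'">
      Show full specifications
    </button>
    <div id="full-specs" style="display: none;">
      <p>Complete product specifications, visible after one click.</p>
    </div>
    ```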

    | Everett
    0

  • Yes, your landing page most likely does not have as many backlinks as your homepage. You would need to build some powerful backlinks to the landing page; ideally, you would also interlink from your more powerful pages to the relevant pages that lack links. Here are two reports on the site: https://marketing.grader.com/report/www.myoptimind.com/overall and http://tools.quicksprout.com/analyze/http%3A%2F%2Fwww.myoptimind.com%2Fweb-design-services-philippines#!/ Dynamic URLs were found on the page, for example Contact Us - http://www.myoptimind.com/?page_id=105 There is another problem: your site is not responsive at all, so it will not fare well on mobile devices. The landing page is over 3 MB, which unfortunately makes it very slow as well. I do not know the speed of your site in the Philippines, but you can check it with a tool like webspeedtest.org. Host your site on a server that is close to your target market; if your server is far away for a good reason, use a CDN and optimize it. Tested out of Argentina, your site loads in 9 seconds on the first view and, on the repeat view, when it should be compressed and cached, it still runs 8 seconds: http://www.webpagetest.org/result/140623_3C_681/

                   Load Time  First Byte  Start Render  DOM Elements  Doc Complete (Time / Requests / Bytes In)  Fully Loaded (Time / Requests / Bytes In)
      First View   9.928s     1.296s      0.000s        464           9.928s / 78 / 1,518 KB                     16.225s / 116 / 1,691 KB
      Repeat View  8.671s     1.150s      0.000s        464           8.671s / 78 / 1,596 KB                     14.808s / 117 / 1,767 KB

    Fix this by optimizing the site (http://www.feedthebot.com/pagespeed/), and consider hosting in or near your target market: http://www.terremark.com/data-centers/latin-america/ http://www.duplika.com/hosting-premium

    | BlueprintMarketing
    0

  • This is excellent advice. Thanks for your input, ForForce!

    | Christy-Correll
    0

  • Google cannot read ALL JavaScript in every situation, yet **they most definitely can read at least some of it in some situations.** The article quoted was from 2012, and a LOT has changed since then. A May 2014 post on the Google Webmaster Central blog mentioned that they can read/execute it sometimes, while other times they can't, though they do not go into specifics. A direct quote from Matt Cutts: “Once that JavaScript has all been loaded, which is the important reason why you should always let Google crawl the JavaScript and the CSS – all those sorts of resources – so that we can execute the page,” he continues. “Once we’ve fetched all those resources, we try to render or execute that JavaScript, and then we extract the tokens – the words that we think should be indexed – and we put that into our index.” (Source: Webmaster video, April 7, 2014.) Oh, and if you block that JS in your robots.txt file, Google MAY respect that, and Google MAY ignore it. While robots.txt files USED to be a firm "directive," nowadays they're just one more signal/hint. Do not assume that in your unique situation Google will or will not find that content, or will or will not see it as duplicate content, just because they "may" or "may not" be able to figure out your JavaScript. Want to avoid that insanity? Generate the content you show in those pop-ups on actual unique URLs that get loaded into the pop-ups. Then, on those specific URLs, add a noindex,nofollow meta robots tag AND a canonical tag pointing to the main product description page where that content appears in its primary form. Or write entirely unique content for those pop-ups.
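    As a sketch of the tag combination suggested above (the pop-up URL and product URL are hypothetical), the unique pop-up URL would carry both tags in its head:

    ```html
    <!-- On the pop-up's own URL, e.g. a hypothetical /popups/size-guide.html -->
    <head>
      <!-- Keep this helper URL out of the index and don't follow its links -->
      <meta name="robots" content="noindex, nofollow">
      <!-- Point engines at the page where this content appears in primary form -->
      <link rel="canonical" href="http://www.example.com/products/widget/">
    </head>
    ```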

    | AlanBleiweiss
    0

  • It's not a plugin... it's available in a base WordPress install. Use `<!--nextpage-->` in your posts: http://www.wpbeginner.com/wp-tutorials/how-to-split-wordpress-posts-into-multiple-pages/
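    The base-install feature in question is WordPress's nextpage quicktag; placed in the post editor's text view, it splits the post wherever it appears:

    ```html
    Content of page one...

    <!--nextpage-->

    Content of page two...
    ```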

    | ForForce
    0

  • Thanks for the thorough response! Ecommerce sure does add a level of complexity to the whole process.

    | Ray-pp
    0

  • Thanks Prestashop. I'll look into that and let you know.  Appreciate the advice

    | Prime85
    0

  • We don't use any off-the-shelf eCommerce platform; ours is fully developed in-house. The question was more about whether it is good to noindex the search results, which you have answered. I will go about putting this in the robots.txt.
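    Assuming the internal search results live under a path like /search or use an ?s= query parameter (both hypothetical; adjust to the actual URL pattern), the robots.txt entries would look like:

    ```txt
    User-agent: *
    Disallow: /search
    Disallow: /*?s=
    ```

    Note that robots.txt only blocks crawling (and Google supports the * wildcard, though not every crawler does); to be certain pages stay out of the index, a noindex meta robots tag on the search-results template is the more reliable signal.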

    | Lantec
    0

  • Pleasure is all mine my friend. You are most welcome. Moz SEO community is an indispensable asset and weapon in any SEO's inventory in my opinion. We learn a great deal here while helping others. I am really thankful to each and everyone here on Moz community. Long live Moz and Mozzers. YOU ROCK!!

    | Devanur-Rafi
    0

  • Thanks guys. I feel quite confident that I have implemented the sitemap correctly (kinda wish I had done it through metas in the head now, though, as I think it might get indexed quicker that way)! It's a relief that you both say this could take a couple of months to index correctly, and yes, I will be doing some promotion for it.

    | AndrewAkesson
    0

  • Sorry for the cliffhanger, but I did some more research and wanted to follow up. After looking over the mobile site's backlink profile, I'm convinced it's not related to the mobile version having a better backlink set than the desktop (mobile backlinks = 4, desktop = 1,400). The overall quality of the backlink profile on the desktop version does not send up any red flags for me (or for any of the software used to analyze it): mostly industry-related Q&A links, local business listings, fairly few general directories (about 3%), and other hospital links (the client is in the healthcare industry). No manual penalties were ever issued to the desktop version. I appreciate your feedback as well, Bridget; the rel=alternate and rel=canonical tags for each version of the site appear to be appropriately inserted based on Google's guidelines, so I'm still scratching my head trying to figure this one out.
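    For reference, the bidirectional annotation mentioned above follows this pattern in Google's guidelines for separate mobile URLs (the URLs here are hypothetical):

    ```html
    <!-- On the desktop page, e.g. http://www.example.com/page-1 -->
    <link rel="alternate" media="only screen and (max-width: 640px)"
          href="http://m.example.com/page-1">

    <!-- On the corresponding mobile page -->
    <link rel="canonical" href="http://www.example.com/page-1">
    ```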

    | Etna
    0

  • You're throwing out the baby with the bathwater (to use a colloquialism). External-pointing, followed links are not only good for the web as a whole; they're good for YOUR site, too. We've seen numerous examples of sites that began opening their external linking policies and received greater search traffic and rankings as a result. The most famous of these is the NYTimes, which Marshall Simmonds talked about in his Whiteboard Friday here: http://moz.com/blog/convincing-upper-management-aka-justifying-your-existence-whiteboard-friday I'd also suggest watching Cyrus' video on the topic of linking externally here: http://moz.com/blog/external-linking-good-for-seo-whiteboard-friday And finally, I'd point out that sites that never link out with followed links create the perception that they are not generous and thus not deserving of links of their own. You might point out that only a fraction of web users know what a nofollow link is, and my response would be that those are the same people who control most of the websites and links. All in all, I'd strongly advise against this (and Google does, too!).

    | randfish
    0

  • Hi DakotahW, Our store front is Volusion based as well and we recently began using Moz to aid in our efforts to improve our site SEO. We are experiencing the same problem with duplicate page content. How did you go about correcting this issue? Do you have any advice on how we can avoid these errors?

    | PartyStore
    0

  • Are you absolutely sure there isn't something else going on? Usually Google is pretty good at picking up this type of thing and just discounting the links. Plus, if the links pointed to pages that didn't exist, then they wouldn't be passing any signals (i.e. a penalty) to the site. In fact, removing pages is one way to remove links. (See http://searchenginewatch.com/article/2296653/Removing-Unnatural-Links-by-Removing-Pages-on-Your-Website) You mentioned it happened about 6 weeks ago... I'm not sure if it fits, but 4 weeks ago there was a very large Panda update that upset a lot of sites' rankings. Any chance the drop happened around May 20? Have you checked WMT and sucuri.net for signs of malware? Often when pages get injected like that, it comes along with malware. I'd also look for things like accidental noindexing or blocking pages with robots.txt. Three site owners consulted with me this week because they thought they had penalties; two had lost their analytics code, and the other had accidentally noindexed the majority of their site. If the ranking drop is indeed due to the bad links, then you really should recover now that they've all been disavowed. But I'm guessing there's something else going on.

    | MarieHaynes
    0

  • Yeah, that's definitely an issue we've had in the past (PHP execution time). It times out so quickly when loading anything. Hopefully it's just a server-switch issue and we can upload directly, as our robots.txt and sitemap.html aren't showing up either. Thanks for leading me in the right direction. Very helpful.

    | IceIcebaby
    0

  • Hi Jesse, I can see why some people would argue that you should have a consistent design across all the templates on your site, including your homepage. The homepage is in most cases the most linked-to page on your site and, because of that, has the highest authority. If some important links aren't on there, it could be that you're not passing on some of the value/authority that the homepage has.

    | Martijn_Scheijbeler
    0

  • The easiest way is to rebrand with a completely original word or phrase, so that people would only ever write (or search for) that word or phrase in reference to your company. The other major way would be to use schema code on your website to "tell" Google exactly what you are: a business in Sweden (and not a town in Minnesota), so that there is less confusion on the part of the search engine. I'd start here. Good luck!
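    As a sketch of the second approach (the business name and address below are hypothetical), schema.org LocalBusiness markup in JSON-LD lets you state the country explicitly:

    ```html
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "LocalBusiness",
      "name": "Example AB",
      "address": {
        "@type": "PostalAddress",
        "addressLocality": "Stockholm",
        "addressCountry": "SE"
      }
    }
    </script>
    ```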

    | SamuelScott
    0