Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • Must've been a Webmaster Tools issue - tried the whole process again and this time Webmaster Tools verifies the site with the 301 redirects in place

    | cmscss
    0

  • This will depend on what slideshow plugin you are using. If you respond with that information it will be easier for someone to give you a definite answer, and I will try to do so myself. Edit: for what it is worth, I am seeing the alt tags on your photography website.

    | AlexMcKee
    0

  • Hi, I just checked my existing Disavow.txt file and the SEO company we hired to put us in this mess we're in had previously added secretsearchenginelabs.com to it. I should have checked that file before I posted my question. My apologies and please disregard.

    | McCaldin
    0

  • Looks like it's not just us - this is starting to get more attention: https://www.seroundtable.com/google-cache-dated-19497.html

    | AndyMacLean
    1

  • Yes, thank you guys - that's how I thought about it as well and just wanted to clarify

    | _Heiko_
    0

  • /index.html should 301 redirect to the root domain (both www.example.com/index.html and example.com/index.html should redirect to www.example.com). For multilingual sites, you will need to set up hreflang for each language, either via tags or via the sitemap. Check out https://support.google.com/webmasters/answer/189077?hl=en for more info.

    | OlegKorneitchouk
    0
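The redirect setup described in the answer above could be sketched as follows (a minimal sketch, assuming an Apache server with mod_rewrite enabled; example.com is a placeholder domain):

```apache
RewriteEngine On

# Redirect the bare domain to the www version (301 permanent redirect)
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

# Redirect /index.html to the root URL
RewriteRule ^index\.html$ http://www.example.com/ [R=301,L]
```

The hreflang annotations can then go in the head of each page (or in the XML sitemap), cross-referencing every language version; here en and fr are placeholder language codes:

```html
<link rel="alternate" hreflang="en" href="http://www.example.com/en/" />
<link rel="alternate" hreflang="fr" href="http://www.example.com/fr/" />
```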

  • Got it! will PM you the problems accordingly!

    | MoosaHemani
    0

  • Unfortunately, the answer is "it depends". I do have some recent experience with this for 2 very small sites (one has around 300 indexed URLs, the other has around 70), which you may find useful. In each case, it took just a day or two to get the most important URLs (best rankings, traffic, link authority, etc.) swapped in for their non-https counterparts. However, deeper URLs with little link authority took up to 90 days to be swapped out. If your most important URLs don't get swapped out in a week or so, I would check these things:

    - Make sure you've updated internal links so that they point to the https URLs. You don't want to pass your link authority through 301s anyway.
    - Make sure all versions of the site are verified in GWT, setting the https version as the preferred version.
    - Make sure your sitemaps (XML and HTML) contain the https versions of your URLs.
    - Make sure the https URLs do not have the non-https URLs set as the canonical version.

    Hope this helps and good luck!

    | MChuckGreen
    0
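One concrete way to do the canonical check mentioned above: view the source of an HTTPS page and confirm its canonical tag references the https URL, not the http one (example.com and /page/ are placeholders):

```html
<!-- On https://www.example.com/page/ this is correct: -->
<link rel="canonical" href="https://www.example.com/page/" />

<!-- Whereas this would tell Google to keep the http version indexed: -->
<link rel="canonical" href="http://www.example.com/page/" />
```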

  • Hi Mick, One more question if you don't mind: how would this affect my traffic results? Could this be why my direct traffic is off so much? Or would it show in my organic traffic? My direct traffic used to be 3x or 4x my organic or paid traffic; now it's barely 1/10th. Cheers.

    | b4cab
    0

  • Hi BajaSEO, I would recommend that you start here: http://moz.com/blog/pigeon-advice-from-local-seos Pay particular attention to Linda Buquet's comments and then follow them here: http://localsearchforum.catalystemarketing.com/google-local-important/21886-pigeon-analysis-new-insights-about-crazy-google.html I believe this is what you are asking about, and I suggest you read that whole thread on Linda's forum. It's 5 or so pages long, but I recommend you follow through on the whole thing.  It's the best conversation I have seen regarding the differences between the classic and new Google Maps and reading it should at least confirm what you are experiencing. The results are in flux, likely due to rotating data centers, but as to when this would change, I'm afraid only Google can say.

    | MiriamEllis
    0

  • As you said, there is no benefit, or nearly zero benefit. It would look much better if you had your own .com domain rather than using a subdomain.

    | webtheoria.com
    0

  • Hi, yeah, let's say they use Xrumer. They hack your site, insert pages of their own, and add links on your pages. They put those URLs in text files based on their keyword targets/groups. Then they run the software, using those lists with their link sources and their auto-insert random URL template. That pings a 404 to GWT, so the 404 shows up there. If these are pros, they already know that the pages are dead by now, as they confirm links after each run. It just takes a bit more time for GWT to get notified, so you'll see them trickle in - those 404 pages getting links on different dates. Hope that helps

    | DennisSeymour
    0

  • If they were recently randomly generated by some errant code, you can let them die and 404. The only time you want to redirect a page to another is if that page 1) is getting traffic or 2) has backlinks. Since you're dealing with a code error, those two are very unlikely and you can be confident in just killing the pages.

    | EricaMcGillivray
    0

  • I'd also recommend WordPress. It has an easy-to-use interface with tons of themes to choose from. The CMS creates clean URLs, and you can install the Yoast plugin for free - it is a great SEO tool that covers all the bases.

    | adamxj2
    0

  • Hi Dave, Just add the https version to Webmaster Tools. Then wait. That'll fix all your issues. The collected data from the old http property won't show in the new one, though. It's completely normal for that to happen, so don't worry. Hope that solves your problem

    | DennisSeymour
    0

  • The_Sage's answer is excellent in my opinion. Personally I am a modern user, but the large majority of visitors to the websites I manage are not. There are a few ways of checking what kind of visitors you have using Google Analytics: https://www.google.it/webhp?q=google+analytics+users+scroll

    | max.favilli
    0

  • If it is the case that no URLs for .us should exist (there are no new URLs), then you can remove them pretty swiftly in Webmaster Tools >> Google Index >> Remove URLs >> select the root URL and choose to remove all directories under it.

    | MickEdwards
    0

  • The only danger is if you disavow the wrong links. As long as you are careful not to include legitimate links in with the bad ones in the disavow file, you should be fine. Best practice is to do a link audit and sort things out, then contact webmasters about taking down the links, then run the disavow. Given the apparent clarity that this was an external negative SEO campaign, I would consider skipping the second step and including a comment regarding your theory on the origination of the links in the disavow file... but I've never tried that. I've only been in situations where the disavow was based on misguided previous SEO efforts, so we went through the removal step. Hopefully someone else can jump in on that.

    | JFA
    0
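A disavow file with a comment, as suggested above, might look like this (the domains and URL are placeholders; in the documented disavow format, lines starting with "#" are comments, "domain:" disavows all links from a domain, and a bare URL disavows a single page):

```text
# These links appear to be part of a negative SEO campaign
# targeting our site; we did not create or solicit them.
domain:spammy-link-network-example.com
http://bad-directory-example.net/our-site-listing.html
```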

  • I was aware that the mobile icon was being tested and was recently confirmed to be rolling out. As such, and as per Leonie, you may not have seen it in your browser, as only the odd tester saw it. Going forward this will likely be visible to all. A bit more can be seen in these posts: https://www.seroundtable.com/google-anti-mobile-friendly-icons-19282.html https://www.seroundtable.com/google-smartphone-icon-19233.html https://www.seroundtable.com/google-smartphone-icon-15532.html There is also a bit on mobile-friendly ranking factors here: https://www.seroundtable.com/google-mobile-ranking-algorithm-19463.html

    | TimHolmes
    0