Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • Anthony, thanks so much for chiming in. My gut instinct is to agree with you. I think I'm going to sit on this one and think about it for a little bit. It will also give me some time to formulate the best approach if I do decide to ask him to change the links to "nofollow."

    | danatanseo
    1

  • Hey Paul, that makes complete sense. Thanks for your help.

    | IceIcebaby
    0

  • Use Screaming Frog to see what can be spidered and where from.

    | danwebman
    0

  • Hi there, The authorship is based on the personal page, not the business one. I suggest you get some followers (or join some groups) on your personal page and use it a bit. The business page is more for the rel=publisher tag.

    | GPainter
    0

  • Referring pages is the number of pages (URLs) pointing to your site. Total backlinks is the total number of links you have to your site. Example: 10 pages each have 2 links to your site. Referring Pages: 10, Total Backlinks: 20.

    | William.Lau
    0

  • Hi Marty, Thanks for your help. Sure - in the site auditor, under Meta, Content and Semantic is where it is pulling this new link from.

    | webbmason
    0

  • Hey Fuel Interactive -- To answer your specific question, I would use the meta (noindex, follow) tag instead of the canonical. It doesn't pass authority, but it is the correct usage. It will strike the page from ranking consideration, and allow the overall course page to have less competition. Another question for you: is the article listing page already ranking higher than the overall course page, or is this a worry? If it's a worry, I recommend testing it out first so you don't prematurely optimize. Just a thought. Hope that helps -- Andrew
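    As a rough sketch, the tag Andrew describes would sit in the page's <head> like this (illustrative only, not a drop-in recommendation):

    ```html
    <!-- Keeps this page out of the index while still letting crawlers
         follow its links and pass authority to the pages it links to -->
    <meta name="robots" content="noindex, follow">
    ```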

    | AndrewAtMGXCopy
    0

  • Thanks for the update, Conor - glad to hear you've sorted it. Best, Mike

    | MikeTek
    0

  • Hi Tim, I think you should do the migration all at once. Make sure you 301 every page, or at least the most important ones. This video might help you: https://www.youtube.com/watch?v=r1lVPrYoBkA Magento is a great platform, but if it isn't implemented well it can become very slow. If you 301 your old pages to a much slower page, or to a page with very different content, you might have problems. As long as you make sure your content is aligned and your Magento site is fast, you'll be fine.
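    As an illustration, if the old site runs on Apache, a single page-level 301 in .htaccess might look like this (both paths are hypothetical examples):

    ```apache
    # Permanently redirect an old product URL to its new Magento equivalent
    Redirect 301 /old-shop/blue-widget.html /blue-widget.html
    ```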

    | Felip3
    0

  • Odd - the exact page is http://www.evo.com/alpine-ski-boots/full-tilt-drop-kick.aspx. But there is no Schema markup and the OpenGraph image is of the yellow boot and so are the product images… The image they appear to be showing is from the “Others Liked These Similar Items” from the lower right nav… [image: full-tilt-drop-kick-ski-boots-2014-black.jpg] Which is not Schema or OG coded…
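    For reference, the Open Graph image declaration being discussed takes this form (the URL here is a made-up example):

    ```html
    <!-- Tells scrapers which image to treat as the page's canonical image -->
    <meta property="og:image" content="http://www.example.com/images/boot-yellow.jpg">
    ```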

    | Digital_SEO123
    4

  • It's good practice, especially if you are operating a CMS that can create accessible URLs that cause duplicate content problems, create "junk" pages, etc. For example: http://www.asos.com/robots.txt Google dislikes search results pages being indexed, so you can block those off, e.g. http://moz.com/robots.txt You can disallow the archive.org bot if you don't want old versions of your site appearing in its search engine, and as others have said you can point to your xml sitemap. It's not a bad resource to have at your disposal for site hygiene / maintenance reasons, but it's not an absolute necessity either.
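    A minimal robots.txt along the lines Jane describes might look like this (the paths are hypothetical, and the ia_archiver token is the commonly cited one for the archive.org bot - check current documentation before relying on it):

    ```text
    User-agent: *
    Disallow: /search/        # keep internal search results pages out of the index

    User-agent: ia_archiver   # the archive.org crawler
    Disallow: /

    Sitemap: http://www.example.com/sitemap.xml
    ```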

    | JaneCopland
    0

  • Thanks for the feedback. I agree, it seems to be Google-specific. Scratch that - Webmaster Tools is showing a manual action. Now the fun part...

    | BTeubner
    0

  • Thanks for the response. Google isn't just indexing the pages, they're sending a significant amount of traffic to them as well. Of course, not all 1M pages are the same quality, so would it be better to have a sitemap of just the highest quality pages and see if Bing will index more of that list?
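    If you do split out a high-quality subset, one common approach is a sitemap index pointing at separate files per tier, so you can track each tier's indexation separately in the engines' webmaster tools (the filenames here are hypothetical):

    ```xml
    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap><loc>http://www.example.com/sitemap-top-pages.xml</loc></sitemap>
      <sitemap><loc>http://www.example.com/sitemap-long-tail.xml</loc></sitemap>
    </sitemapindex>
    ```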

    | Porch
    0

  • Radically changing rankings are normal for a new site, or a site with little authority. Established sites with good off-page optimisation rank more steadily, holding the same or similar rankings week after week. We would need to see your site to decide if you had an over-optimisation problem, but I find that unlikely - the ranking issues you describe sound more like the site lacks strength. Generally, social media sharing, bookmarking and forum promotion are not enough to make a big difference, especially in more competitive markets.

    | JaneCopland
    0

  • Great job, Mark! I can see from this end that nearly all of those unwanted URLs have already dropped out of the results. That's far quicker than even I expected! And the ones that aren't gone are leading to a 403 Forbidden page, which is great. One last thing you can do if you want: because you are on HostGator, they are displaying their custom 403 error page, which has their branding all over it (nasty, kinda ugly). You could create your own simple 403 error page, add your own basic branding to it, and, for instance, add a line that says something like "You don't have permission to view this page, or it is blocked for security reasons. Drop by the home page [link to home page] to find what you're looking for, or to conduct a search." This basic page can replace the one that HostGator provides by default, so any visitors who hit it by accident will still feel like they are on your site and will have a suggestion for what to do next. Your hosting control panel will have instructions for how and where to provide your own custom error pages. Hope that last little tweak's useful. Paul
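    On most Apache-based shared hosts like HostGator, pointing the server at a custom error page is typically a one-line .htaccess directive (the filename is a hypothetical example):

    ```apache
    # Serve our own branded 403 page instead of the host's default one
    ErrorDocument 403 /custom-403.html
    ```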

    | ThompsonPaul
    0

  • Absolutely get those 301s into place as soon as possible, Beth! Not only will you likely see some increased traffic from links that are out there to the old pages, but you'll also likely see a nice rankings boost. Right now, any links to the old pages are essentially "lost" to your site for ranking influence purposes. Getting the redirects in place will allow that ranking influence to again be credited to the client's new pages.

    When you do start adding the redirects, make sure to add an Annotation to the related Google Analytics profile. Depending on the number and quality of the redirected pages, and on whether the site's 404 page currently has Analytics tracking, you're going to see a bit of a shift in engagement metrics. If there's no tracking on the 404 page, you'll see an increase in visits as visitors land on "real" pages instead of the 404. If there was 404 tracking before, you'll see a decrease in Bounce Rate and an increase in pages/visit as far more visitors stick around on the real pages instead of just bouncing from the 404 page. You'll want to be able to refer back to the date the redirecting started so you'll always be able to put stats changes into context around this process (e.g. a year from now, when the client is trying to figure out why there was a site improvement around this time). [Hint: make sure you've got solid 404-page tracking in Analytics and keep checking it as you go along. It's an essential addition to just watching what shows up in Webmaster Tools, for example.]

    Some more suggestions for the process:

    - Use Analytics to track improvements in the metrics you expect to benefit from this process. This is how you'll demonstrate the benefit of the work, and get credit (and therefore reputation) for your efforts. You can even set up Goals around the expected improvements to make them easier to track.
    - Use Screaming Frog, Xenu Link Sleuth or an equivalent tool to crawl all internal pages and ensure none of your own pages include broken internal links. Screaming Frog (paid version) can also be used to bulk-test your redirects immediately after implementation.
    - Watch for any high-value incoming links to old pages that you think you might be able to get corrected at the source (i.e. an external site you have any sort of relationship with). Since each redirect wastes a bit of "link juice," you're even better off getting the original link corrected to point to the right page instead of having to go through the redirect. Only worth it for strong links.
    - Watch for opportunities to use regex to combine several redirects into one rule. Fewer rules is better for site speed.
    - If you don't have a copy of the original site to extract the URLs from, you can use the Wayback Machine to see a version of the site from before the migration. To create a list of the old URLs that are still indexed, use the site:mydomain.com search operator to find the majority of still-indexed URLs. You can then use the SERPSRedux bookmarklet to scrape all the results into a CSV and use Excel filtering to find all the old URLs. (Tip: set your Google Search results to show 100 results per page to make the scraping faster.)
    - Set up an ongoing, regular process for finding and dealing with such 404s. Any site should have this in place, but especially one that has been redeveloped.
    - Lastly, since you know you've got a lot of 404s coming in, make certain you have a really top-notch 404 error page that is designed to capture as many visitors as possible and help move them to real content without losing them. Again, important for any site, but well worth extra attention for any site that knows it has a 404 problem. (This is far better than "soft 404ing" to the home page, for example, for a number of technical and usability reasons.)

    So, bottom line on "whether this is worth my time and effort?" You'd better believe it is. Probably one of the best things you could do for the site at this point. I have direct experience doing this for several sites, and the improvements are significant and quite gratifying, both for you and the site owner. Hope those are useful ideas? Paul
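    The regex suggestion above can be sketched as a single Apache rule replacing many one-off redirects (the URL pattern is a hypothetical example):

    ```apache
    # One rule 301s every /old-blog/... URL to its /blog/... equivalent,
    # capturing the trailing path with (.*) and reusing it as $1
    RedirectMatch 301 ^/old-blog/(.*)$ /blog/$1
    ```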

    | ThompsonPaul
    0

  • Oops, good catch Paul, you're correct!

    | MichaelC-15022
    0