Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • Something in the title you sent triggered a thought, and after checking I realized you're dealing with a .co.uk domain. I have found Google.co.uk to be far more tolerant of heavy keyword use and even link spam, so you're probably in a battle with folks who are indeed keyword stuffing or worse, and finding yourself having to do the same just to keep up. It's a bit of a slippery slope, but I will admit that even some recent work I did in the UK required a slightly more heavy-handed approach to SEO than I'd typically take. So while I wouldn't recommend it in the US, the title you're suggesting will probably work well in the UK. Cheers! Dave

    | BeanstalkIM
    0

  • Hello Joshua, What you are describing is nothing to be concerned about. It is a completely natural process, when content is being created, for there to be some form of reciprocal linking. This is especially true in list pieces such as the one you are describing. There is no real need to avoid linking directly to them, and certainly nothing to worry about with regard to their social media accounts. What Google is trying to get away from is people creating websites to link to each other using the same hosting or from the same webmaster. That is what leads to penalties. From their perspective, you are all (likely) on separate hosting, you all have different webmasters, and you are clearly recognized brands that are completely separate from one another. This is the kind of article they would want to see show up, and it is unlikely to create any unwelcome attention. The links you receive will have plenty of value, assuming you are not being linked to extravagantly over and over from the same domain. It's totally normal to see a couple of pages on a single domain link to another, but it gets to be spammy when you begin seeing tens, hundreds, or even thousands of links coming from a single source. What you are describing is normal content creation - something Google has been encouraging for years. I don't think there's anything for you to worry about here. Best of luck with the launch! Rob

    | Toddfoster
    0

  • Agree with the points above, with one exception. Yes, you have to find a way to deal with duplicate and quality content at scale. Yes, robots.txt, nofollow links and index sitemaps are your friends. But I would not use rel=canonical unless I had to. Better to get those extra pages de-indexed and then not let Google crawl the URLs with the extra parameters in the first place. Why waste Google's crawl time on pages that are just re-sorted versions of another? If you use the directives wisely, you probably "only" have 200,000 pages worth crawling, even with that many sort parameters. Good luck!
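
    As a rough illustration, a robots.txt sketch along these lines can keep crawlers out of the re-sorted parameter URLs. The parameter names "sort" and "order" here are only placeholders for whatever the site actually uses:

      User-agent: *
      # Keep crawlers out of re-sorted versions of the same listing pages
      Disallow: /*?sort=
      Disallow: /*&sort=
      Disallow: /*?order=
      Disallow: /*&order=

    Keep in mind that robots.txt only stops crawling; anything already indexed still needs to drop out (or be noindexed) before you block it.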

    | CleverPhD
    0

  • Right. Chained redirects = bad. However, in the same Matt Cutts video, he does say that the overall amount doesn't matter, which is what I was getting at in the first part of my previous answer. Now, let's crunch some numbers to show why the number of non-chained redirects doesn't matter. Assume a perfect world, where the manufacturer's stated numbers are accurate: a standard hosting server running at 2 GHz can do roughly 2 x 10^9 operations per second. Since all .htaccess work happens strictly on the server side (bots and browsers just send a request and get back a response saying whether the page redirects), the only thing that can slow the request down is server response time, and rule matching is a very cheap computation. So say your .htaccess has 1,000,000 redirect rules and the server keeps them in memory to match against each incoming request: that 2 GHz server would need around 2,000 requests per second before it even started to struggle. Do you have 2,000 requests per second to your website and a million redirect rules? P.S. All the numbers above are very rough approximations. P.P.S. If you really want to see whether your server is or would be struggling, log into your web host manager, go to the server status and info page, and check how much of your server's capacity is actually being used - usually it's below 6-7% at least 90% of the time. Hope this clarifies some things.
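
    For reference, individual (non-chained) redirect rules in .htaccess look roughly like this - the paths here are made-up examples - and the matching described above is simply the server checking each incoming request against lines like these:

      # One-to-one 301 redirects (mod_alias); example paths only
      Redirect 301 /old-page.html /new-page.html
      Redirect 301 /old-category/old-post /new-category/new-post

    Even with a very large number of such rules, that per-request matching cost is tiny compared with everything else involved in serving the response.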

    | seomozinator
    0

  • Hi, Could you clarify what you mean by "none of the images are indexed in Google Webmaster"? Did you submit an image sitemap? If you do an image search, it seems that the correct images are found. The most important factors for image search are correct alt text (seems to be OK) and a descriptive image name (could be improved - you use spaces in the filenames; it would be better to use '-'). You also provide a correct alt & title on the link to the bigger image, so that looks OK as well. You could add a caption to the image to make it even more obvious what the image is about. A good guide on image optimisation can be found here. Hope this helps, Dirk
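
    To make the filename/alt/caption points concrete, a marked-up image could look something like the sketch below (the filename and wording are invented for illustration):

      <figure>
        <!-- Descriptive, hyphen-separated filename plus meaningful alt text -->
        <a href="/images/red-widget-large.jpg" title="Red widget - large view">
          <img src="/images/red-widget-thumb.jpg" alt="Red widget with chrome handle">
        </a>
        <figcaption>Our red widget, shown with the chrome handle option.</figcaption>
      </figure>

    The idea is simply that the filename, alt text and caption all describe the same thing in plain language.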

    | DirkC
    1

  • Completely agree with Moosa. You can also check the post below: http://blog.woorank.com/2013/03/a-guide-to-clean-urls-for-seo-and-usability/ Hope this helps. Thanks

    | Alick300
    0

  • Hi Marina, I have come across issues with redirect plugins on WordPress when trying to deal with http and https. They can have bugs which cause redirect loops or other issues that stop your page from loading. When you say your design is messed up, do you also get a warning about the page containing insecure elements or similar? This often happens when moving a site to https - you may find this plugin helpful for dealing with the issue - https://wordpress.org/plugins/ssl-insecure-content-fixer/ - then you can choose to have your site fully https if desired. In terms of the redirect, what server are you on? If you are on an Apache server you can easily take care of this redirect by logging in via FTP and editing the .htaccess file. The following code will redirect the whole site to https once you have fixed the layout with the above:

      RewriteEngine On
      RewriteCond %{SERVER_PORT} 80
      RewriteRule ^(.*)$ https://www.domain.com/$1 [R,L]

    replacing www.domain.com with your domain. I personally would go down this route and have your whole site on https once you have dealt with the design/content issues - but to help with your decision, take a look at this great post from Cyrus Shepard - https://moz.com/blog/seo-tips-https-ssl. Note that many sites have now moved to https - look at Moz, for example. Hope this helps! Matt

    | Matt-Williamson
    0

  • Hey Marina, Yes, this method will work without affecting your root domain. This question has already been answered in the following threads; please refer to them for more details: https://moz.com/community/q/how-to-remove-an-entire-subdomain-from-the-google-index-with-url-removal-tool https://moz.com/community/q/dev-subdomain-pages-indexed-how-to-remove Hope this helps! Umar
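
    For anyone landing here later: I can't see the asker's exact setup, but one common approach (whether or not it is exactly what the linked threads describe) is to serve a blocking robots.txt from the dev subdomain only - e.g. at dev.example.com/robots.txt, a placeholder name - and then request removal of the subdomain in the URL removal tool:

      User-agent: *
      Disallow: /

    Because the root domain serves its own separate robots.txt, it is unaffected.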

    | UmarKhan
    0

  • There's not a whole lot to be done about other sites getting hacked, unfortunately - even if you did find the program/network that is creating the bad links, you might not be able to tell which competitor of yours is doing it, and even if you did, it would be difficult to prove or to get them to stop. Your client's site may not even be the real target - link spam networks will often target a variety of random sites in addition to their real targets to cover their tracks. The good news is that Google is pretty astute at ignoring these types of links. It's highly unlikely (though not impossible, unfortunately) that your client will see a link penalty from this malware. Here's what I would do in this situation:
    1. Contact as many of the hacked sites as you can/want to, and let them know they've been hacked (that's just a courteous thing to do, as it's likely the site owners have no idea, but it's up to you how much time you want to spend on this).
    2. Be proactive about disavowing as many of the spammy inbound links as you can find (you're probably already doing this, I'm just saying). You could probably do this at a domain level, since it sounds like the linking sites aren't related to your client's site in any way - a rough example of the file format follows below.
    3. Take a little extra time in the next few months to build some high-quality links to your client's site to make their link profile a little extra shiny.
    Taking steps to guard against any damage/fix any damage that may have been done will be a better use of your time and effort than trying to figure out who's doing it and why, IMO.
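
    If you do go the domain-level route, the disavow file is just a plain text list uploaded through Google's disavow links tool; the domains below are invented examples:

      # Hacked sites pushing spammy links - disavowed at the domain level
      domain:hacked-example-one.com
      domain:hacked-example-two.net
      # Individual URLs can also be listed on their own lines
      http://spammy-example.org/bad-links-page.html

    Lines starting with # are treated as comments.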

    | RuthBurrReedy
    0

  • To determine whether your website is hacked, this is one of the best tools I know of, both to find out and to remove the malware: https://sitecheck.sucuri.net/ To determine whether or not you have on-site SEO problems on a very technical and granular scale, I would use https://www.deepcrawl.com/ - at $80 a month you cannot go wrong. Another amazing tool is http://www.screamingfrog.co.uk/seo-spider/ - it's free for the first 500 pages, and if you want the added features (which you do) or more pages, it's only about $150 a year.

    | BlueprintMarketing
    0

  • It is not unusual to redirect a whole website and all of its pages, so it wouldn't make sense that you could have too many. And you should keep the redirects indefinitely - what if there is a link to the expired page? You do not want to lose that equity by having it go nowhere...

    | Linda-Vassily
    0

  • It definitely can. I once saw a site that had hundreds of redirects, and it was definitely slowing the site down, as the server had to run through all the redirects first to see where the user should be forwarded.

    | Martijn_Scheijbeler
    1

  • Hi Rand, Thanks for taking the time to answer my post. It was actually my Moz Analytics campaign which flagged up the issue - so I understand they're guidelines rather than definitive answers. I think you've cracked it for me though! A true gent - your moustache would be proud. Where my site differs is that it does not link to the individual posts and shows all content from those blog posts in full on the category page. I'm now going to go away and (get someone much cleverer than me to) implement changes so that snippets of the blogs link through to the full blog articles. Thanks again for the help.

    | sbridle
    1

  • Did you validate both the www & the non-www version? It could take some time before the links are taken into account. Apart from that, GWT is not exactly best in class at showing the links to your site. This is the reason why Open Site Explorer, Ahrefs, etc. exist. Dirk

    | DirkC
    0

  • Hi Cal, URLs are, in my experience, a very small part of the ranking algorithm. Data from Moz's recent Ranking Factors correlation study seems to back this up - see https://moz.com/search-ranking-factors/correlations. As such, I wouldn't worry too much about the URL structure when making decisions like this. That said, I'd tend to lean towards the "Default" option you provided, because you said you were writing blog posts, so /blog/ shows users what to expect from the page a bit better than the "Alternative" option. If I'm a user and I see the URL http://chocchip.com.au/services/website-design/medical-clinics, I expect to see a services page, not a blog post. Just my two cents - feel free to message me if you need any more help. Mark

    | Mark_Ginsberg
    0

  • Hello, did you ever figure out how to fix this?

    | pentagramamx
    0

  • Hi Eddie, This URL (www.widgetcompany.co.uk/widget-gallery/coloured-widgets/red-widgets) really looks spammy. The second URL is perfect; you should go with that one for the keyword "red widgets". Thanks

    | Alick300
    0

  • This is what I would do, too: search for a large chunk of text from the page and see if the page comes up. Site: is not always 100% accurate.
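
    In practice that means running both kinds of query - these are made-up examples - and giving more weight to the exact-phrase search than to the site: operator:

      site:example.com/blue-widgets
      "our blue widgets are hand-assembled in a small workshop outside Leeds"

    If the quoted phrase doesn't bring the page up, that's a stronger sign it isn't indexed than a missing site: result on its own.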

    | RuthBurrReedy
    0