Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • Hi, Do you have different sites targeting different countries, or one main site with different folders or subdomains for each location?  If it is the former, you should purchase a ccTLD that targets that country.  For example, if your site targets people in the UK, you should buy a .uk domain.  If it is the latter, you should include rel="alternate" hreflang="x" tags and use geotargeting.  These two methods will help you avoid duplicate-content issues. I believe the bigger sites you mentioned are doing this, which is why they are still safe from search engines and are ranking pretty high. Google understands that sometimes you just can't rewrite something, and that is why they offer the rel="alternate" hreflang="x" tag.  Check out the following article from Google Support: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=189077

    | TommyTan
    0
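A minimal sketch of the hreflang annotations described above, using a hypothetical example.com with country folders; each page variant lists every regional alternate, including itself:

```html
<!-- Hypothetical single site with country folders: each variant page
     carries the full set of alternates, including a self-reference. -->
<link rel="alternate" hreflang="en-gb" href="http://example.com/uk/" />
<link rel="alternate" hreflang="en-us" href="http://example.com/us/" />
<link rel="alternate" hreflang="x-default" href="http://example.com/" />
```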

  • Hi Nikolas, I wish SEO were like a sixth sense to me.  I believe you got more traffic from Google to your Squidoo page in the past than to your site because Squidoo has higher authority than your new site.  Squidoo has more backlinks and content under its domain, and as you might know, backlinks are an important factor in SEO.  Furthermore, your site is still new; Squidoo has been around for a while, and Google also looks at domain age as a ranking factor.  I don't think you can simply create a new site and expect as much traffic as Squidoo gets.  You will need more time, more content, and some earned links.

    | TommyTan
    0

  • Google's guidelines and the SEOmoz toolbar indicate that anything over 100 links is too much. But if you check out very well-known sites, their internal linking structure is often over 300 links per page (take Amazon, for example). That is one of over 200 ranking factors. If it's helpful for the user experience, leave it; if it's not, get rid of it.

    | NikolasNikolaou
    0

  • Hi Matthew, thank you for your response. All URLs have backlinks, and generally site-wide there are plenty (3 million+). This has been going on for some months now, and none of the experts we have had on it could explain it. Matt Cutts should have a support number for this.

    | Crunchii
    0

  • Hi Jason, What you have right now is a continuous loop.  With the canonical tag you are telling Google that the old page is the preferred page you want to show; however, you then have a 301 redirecting it to the new site. This will only confuse Googlebot and won't help your website at all. Solution: remove the canonical tags from the new pages.

    | TommyTan
    0
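To illustrate the loop described above, here is a sketch with hypothetical URLs: the new page's canonical tag points back to the old URL, while the old URL 301-redirects to the new one.

```html
<!-- On the NEW page (http://example.com/new-page), this tag points back to
     the old URL, which itself 301-redirects to /new-page: a loop. -->
<link rel="canonical" href="http://example.com/old-page" />
<!-- Fix: delete the tag above, or make it self-referential: -->
<link rel="canonical" href="http://example.com/new-page" />
```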

  • You're never going to be able to 100% control who's linking to you.  As long as you have a diverse link profile comprised primarily of links earned by developing useful, quality content, you won't likely be penalized. That said, I WOULD evaluate the link profile to determine whether it's heavily weighted toward sites that Google is scrutinizing (i.e. directories, low-quality article marketing sites, etc.) and determine where you have an opportunity to develop share-worthy content to elevate your profile.

    | Aggie
    0

  • Redirects are not entirely positive from a (Google) SEO viewpoint. Limit them as much as you can. Think about redoing things and starting from scratch if you are not too far into it. Best of luck.

    | Andropenis_Australia
    0

  • Hi Paul, Pagination is always a bit of a sticky area! Firstly, I certainly wouldn't do any user-agent detection; you don't want to get busted for cloaking when you aren't even up to anything that naughty. A nice way I've seen this handled (for WordPress sites, although the idea can work on any site) is with the WordPress Infinite Scroll plugin: http://wordpress.org/extend/plugins/infinite-scroll/ That basically leaves the site as it is for non-JavaScript web browsers (so with pages 1, 2, 3, etc.), but if you have JS enabled (i.e. not a spider bot) it will keep scrolling the page. This functionality could, I guess, be changed to create a pagination effect. Tie this in with some rel="prev" and rel="next" markup (http://googlewebmastercentral.blogspot.co.uk/2011/09/pagination-with-relnext-and-relprev.html) and I think that is certainly one way to fix the problem. Another way could be using similar markup for a 'View All' page: http://googlewebmastercentral.blogspot.co.uk/2011/09/view-all-in-search-results.html Hope that helps! Stuart

    | stukerr
    0
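The rel="prev"/rel="next" markup mentioned above can be sketched as follows, with hypothetical URLs; this would sit in the head of page 2 of a paginated archive:

```html
<!-- Hypothetical <head> of http://example.com/articles/page/2/ -->
<link rel="prev" href="http://example.com/articles/page/1/" />
<link rel="next" href="http://example.com/articles/page/3/" />
```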

  • Hi Sean, Sure it is. Just run the page through SEOmoz's Open Site Explorer - http://www.opensiteexplorer.org/. Put in the exact URL, change the params to "only external", and you can see all of the backlinks to that page, plus each linking domain's authority, etc. It's a very good tool. Cheers, Matt

    | Horizon
    0

  • Thumbs up, Dr. Pete, you definitely explain that much better than I could, and I completely agree: once the 301 is in place, there should be nothing else associated with it. Teginder, I thought I would send this link with a screenshot from a Google search for "staplers". I noticed in your screenshots that you are logged in to Google; I just wanted you to know that if you're constantly searching for staplers and your URL, Google will modify the search results to suit what it thinks are your needs. Hence I did a very unscientific incognito check, allowing Google to give me a less biased search result. To make it more useful, I logged into SEMrush and searched for staplers, and what I received is in the CSV file of the top 10 organic results. So what came up in your screenshots is different from what SEMrush and Google are telling me. https://blueprintmarketing.sharefile.com/d/scdb1ed7e9464929b The very best of luck with your new website. Sincerely, Thomas Zickell

    | BlueprintMarketing
    0

  • This is true. Rewrites using [R=301,L] are great when it is necessary to use a regex to perform the 301s with fewer entries in your .htaccess.

    | coneh34d
    0
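A minimal sketch of the kind of regex rewrite mentioned above, assuming a hypothetical move from /old-blog/ to /blog/; one rule replaces a separate entry per URL:

```apache
# Hypothetical .htaccess: one regex rule 301-redirects every URL under
# /old-blog/ to the same slug under /blog/.
RewriteEngine On
RewriteRule ^old-blog/(.*)$ /blog/$1 [R=301,L]
```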

  • Is there a reason you can't use one page for "Internal Decorating" and have users end up on that page no matter whether they arrive via Residential or Commercial?

    | BedeFahey
    0

  • This has been an ongoing issue with Google Places and many other local directories. A majority of listings are being flagged as spam for several reasons. Some of the reasons that I've found are: the profile not being active or verified, or having only its first review; over-capitalization; short generic feedback (e.g. "They were great! A+"); and an IP address flagged for multiple accounts or login locations. I started noticing this issue when Google shifted all local listings into Google+. I've been following threads in http://productforums.google.com/forum/ but have yet to see anything about changes or fixes to this issue. Some other sites where I've noticed similar issues: Yelp.com, Citysearch.com, Yahoo! Local, and YP.com. There's a thread going since Nov 7 about a similar issue at http://productforums.google.com/forum/#!topic/business/G7RFGxn2cIU I can't seem to find the other threads that I saw before.

    | fishpunt
    0

  • Actually I added it last week, but within a few days it showed up in rankings. I mean, low in rankings, but it showed up pretty quickly. So now I figure if I just claim and Verify this new listing that has already shown up in rankings, we'll get it down quicker and more efficiently.

    | Linwright
    0

  • Hello Zora, yes, I have around 11k pages. About search - wow, how did I miss that? On second thought, that page consists of a Google search. I am gonna change that and see how it turns out. Thanks a lot :}

    | wickedsunny1
    0