Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Thank you Cyrus, I will certainly read the blog post and consider noindex, nofollow on content whose canonical tag differs from the currently served page's URI. I am still a little confused as to why the SEOmoz crawl is highlighting duplicate pages when a canonical tag is present and pointing to the primary content. Take the following page as an example: http://www.planksclothing.com/planks-classic-t-shirt-black-multi.html. Firstly, the page has a canonical tag. There is no search on the site, and products are viewed at root level without a directory structure, which in a Magento instance is the usual source of duplicate content. At the time of writing, SEOmoz is updating my duplicate report, so I can't see what the duplicate content is; maybe the update will show there is none. Thanks. Amendment: after reading the supplied blog post (http://www.seomoz.org/blog/duplicate-content-in-a-post-panda-world) I have learnt that the above page is simply not different enough and probably falls into the area of "thin content".
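For reference, a canonical tag is a single link element in the page's head. A minimal sketch using the product URL quoted in the question (the path is taken from the post, not verified against the live page):

```html
<head>
  <!-- Tells crawlers which URL is the primary version of this content -->
  <link rel="canonical"
        href="http://www.planksclothing.com/planks-classic-t-shirt-black-multi.html" />
</head>
```

Note that a canonical tag consolidates duplicate URLs, but it does not make thin content any less thin.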

    | Flipmedia112
    0

  • Your best bet is to do a 301 redirect. I would look at OSE to see which URL has the higher PA and more links, and redirect the other one to it. Also redirect either the www or the non-www version to a single destination.
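On Apache, both redirects can be done in .htaccess with mod_rewrite. A hedged sketch, assuming a hypothetical example.com where www is the preferred version:

```apache
# Requires mod_rewrite; example.com and the page names are placeholders
RewriteEngine On
# Send non-www requests to the www host with a permanent (301) redirect
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
# 301 a single retired page to the stronger one
Redirect 301 /old-page.html http://www.example.com/new-page.html
```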

    | webfeatseo
    0

  • I have unique content; I only copied the one small article I mentioned above. Regarding links, yes, I am building links, but based on my competitors' link data. Is there any way for me to find out what exactly set my site back and what I need to do to recover its position? Tomorrow I am going to ask my copywriter to write more content for it, and I will fix the content issue. Do you really think the problem is also with the link data? I have no way of removing some of the links, and there are no warnings in GWT.

    | Dex32434432
    0

  • Thanks Ade Lewis for your answer.

    | seoug_2005
    0

  • Hyphens work without problems and are not considered spammy, as long as other spam signals are low. I just ranked a domain with two hyphens from nowhere in the search results to position 5 on page 2 in 6 days, with as little as a dozen backlinks and 30 pages of good content, for a term with a search volume of 100,000. That's on google.fr, but it applies to google.com as well, IMHO. Check a simple search such as "Investment Banking" and see for yourself: how many domains on the first page have hyphens? I see careers-in-finance.com at #5. Matt Cutts once said "hyphens are treated as separators". I would also suggest you watch this video of Matt Cutts on branding versus keyword-laden URLs: http://www.youtube.com/watch?v=rAWFv43qubI&feature=related. However, in your example I would go for taxbond.net because it is simpler, and of course buy tax-bond and redirect it. I would not buy the pluralised version though, as Google will always prefer the simpler one, all things being equal.

    | iung
    0

  • Hello Bill, Thanks for coming to Q&A with your question. The NAP is really the key, more so than the website. For the business to be able to treat each specialty as distinct, it would need to become 4 distinct companies, each with a unique legal business name, a legitimate physical street address and a local area code phone number. This scenario would enable the owner to have a unique Google Place Page for each of the businesses, instead of just one Place Page covering all of his specialties (as well as unique listings in all of the other local business indexes). As things currently stand, he is permitted only one listing per index. This is the case for most businesses like your client's, and by building out the content on his website you are doing pretty much what you can for his organic campaign (plus link building, social media, video etc., of course). The tough thing about clients like this one is that they typically not only offer a menu of very varied services, but also tend to serve a number of surrounding cities. So an SEO/Local SEO campaign typically looks something like this:
    1. Get the client listed in the major local indexes.
    2. Campaign for reviews in a variety of sources.
    3. Get citations for his Google Place Page.
    4. Build out a body of service-related content on the website.
    5. Build out a body of geographic content on the website.
    6. Build links every which way.
    7. Engage in additional forms of marketing that will be most effective at reaching the client's audience (email, video, social media, blogging, etc.).
    In entering into all of this work, the client must be informed up front that his chances of ranking above the fold of Google's results mostly revolve around his services in his city of location, in that he may achieve grey-pinned local results for these 'service + geo' terms. He may not be able to expect top rankings for all 4 services. In any service city where he isn't physically located, the client should understand that he will most likely have to rely solely on the organic rankings below the local results, as Google will view his competitors with physical locations in those cities as more relevant. Clients like these are more complicated than, say, a dentist with an office in Denver. That said, there are substantial benefits to doing the work: even lower rankings can produce trickles of monthly traffic, and if those convert to phone calls and bookings, it has all been worth it. Good luck!

    | MiriamEllis
    0

  • Hi Catline, Many thanks for your response. I'm not quite sure what the purpose would be; with my limited knowledge of mobile SEO, I assumed that for the mobile site to rank well in mobile search results and outperform the desktop site on mobile, I would need links to it. Since posting and reading more, my approach has changed: I would rely on the desktop site's rankings to appear in mobile search results, and then use a user-agent redirect to send mobile users to the mobile site. If I understand correctly, I would use the canonical tag on each mobile page to point to the desktop version, so the desktop page should always outrank the mobile version... Thanks
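For separate mobile URLs, Google's documented pattern is a bidirectional annotation: the desktop page declares a rel="alternate" pointing at the mobile URL, and the mobile page declares a rel="canonical" pointing back at the desktop URL. A sketch with hypothetical URLs:

```html
<!-- On the desktop page, e.g. http://www.example.com/page -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://m.example.com/page" />

<!-- On the mobile page, e.g. http://m.example.com/page -->
<link rel="canonical" href="http://www.example.com/page" />
```

Combined with a user-agent redirect, this tells Google the two URLs are one document, with the desktop URL as the canonical one.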

    | MarkChambers
    0

  • So, when being linked to, or even mentioned without a link, with something like "Dell Personal Computers" or "Xerox Copiers", I tend to agree with you at face value. Only, I have done backlink analysis on the competitors ranking as brands and stores, and I have many more links like this on quality sites (links that are not being discounted) than they do. I think search query volume might play a part in all of this... does anybody have experience with that?

    | tatermarketing
    0

  • Well, that is cloaking, and against Google's guidelines: you are showing the search engine something different from what you show the user. There is no need for Flash; the home page is text, so why not format it as HTML so users will not have to wait? See what Google has to say about showing Flash to users and HTML to bots: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66355

    | AlanMosley
    0

  • What's to stop Google from finding them? They're out there and available on the internet! To block or remove pages using a robots.txt file, put "User-agent: *" followed by "Disallow: /" in the robots.txt file. You might also want to stop humans from accessing the content too: can you put it behind a password using .htaccess, or block access based on network address?
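Putting those two pieces together: the robots.txt rule asks compliant crawlers to stay out site-wide, and HTTP Basic Auth in .htaccess stops humans as well. The file path below is a placeholder:

```
# robots.txt at the site root: disallow everything for all user agents
User-agent: *
Disallow: /
```

```apache
# .htaccess in the protected directory (the .htpasswd path is hypothetical)
AuthType Basic
AuthName "Private area"
AuthUserFile /path/to/.htpasswd
Require valid-user
```

Bear in mind robots.txt is advisory only; the password is what actually keeps the content out of reach.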

    | DougRoberts
    0

  • Hi Rasmus, Thanks for your reply. Google has updated the index, but the destination page seems to have suffered from the redirection. To answer your question, I redirected all the pages to the deep page on SportyTrader.com, as the number of pages was low and they were relevant to that deep page. I chose to redirect this website because I wanted to concentrate my efforts on SportyTrader.com. Both websites are mine. Do you see a reason why the 301 is not helping? Thanks

    | jarnac
    0

  • Hi Fraser, If you don't want to create a separate website for Australia, then I'd suggest you get yourself a .com, then create sub-folders to target the markets which you wish to enter - e.g. yourdomain.com/uk/ for the UK; yourdomain.com/au/ for Australia and so on. You can then target these sub-folders in Google Webmaster Tools. I hope this helps, Hannah

    | Hannah_Smith
    0

  • No, it's not. The rank of the old domain is not transferred to the new domain using frames. Calling an old page may cause it to be loaded in a frame on the new domain, but a 301 will do this and transfer the link juice as well. So the 301 wins hands down every time.

    | AlanMosley
    0

  • Adam151, I am unaware of any methodology for predicting results in SEO. The reasons are myriad, not least the fact that while you are improving your page/domain, any good competitor will be doing the same to a greater or lesser degree. So here is my suggestion: First, go through your site and ensure that your on-page SEO is as perfect as possible. Look at your competitors' link profiles and see which links you can pick up. You can look at the 2011 Search Engine Ranking Factors on SEOmoz: knowing that most in SEO believe this reflects what most impacts rankings, you could allocate your time to each factor on the basis of its effect, such that page-level link metrics and domain-level link authority features take up nearly half of your resources. I suggest running a campaign and adding your top three competitors, so that you have a good picture of what is taking place as you go; by tracking what causes the moves, you will know better going forward. Look at what is being considered more and more important by the search engines, such as freshness (QDF), and make sure your local presence is fully polished. Remember there is more than Google: ensure you are also in Bing Business Portal and Yahoo Local. With all of that, and by tracking the changes for impact, you might later have a better predictive model. Best

    | RobertFisher
    0

  • You are correct, they have little value. It would be a good idea to place them on your own site instead, except that they have probably been copied onto many other sites by now by screen scrapers, and you would not get credit for being the original. You can try Copyscape to see if they have been copied: http://copyscape.com/. No, don't point links at them; point the links at your own site instead.

    | AlanMosley
    0

  • I think you need to be clear that by meta tags you're referring to the meta description (and title). I agree, getting these right can make a real difference. With a compelling title and meta description you can get more than your fair share of traffic, even if you don't rank #1! All you need to do is convince the searcher that they are going to find what they're looking for. Meta keywords are probably best ignored; I believe there is a risk that Bing may use an over-stuffed meta keywords tag as a negative signal.
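As a concrete (made-up) example, the snippet below shows the two tags in question; the store name and copy are hypothetical:

```html
<title>Blue Widgets with Free Next-Day Delivery | Example Store</title>
<meta name="description"
      content="Compare 40+ blue widgets in stock now. Free next-day delivery and 30-day returns.">
```

Search engines truncate titles at roughly 60-70 characters and descriptions at roughly 155, so front-load the compelling part.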

    | DougRoberts
    0

  • I'm not 100% positive, but it does make sense to use it this way: "User-agent: *" followed by "Disallow: /*_Q1$".
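To spell out the pattern: the major engines' robots.txt extensions treat * as a wildcard matching any sequence of characters and $ as an end-of-URL anchor, so this rule blocks any URL that ends in _Q1. These wildcards are an extension supported by Google and Bing, not part of the original robots.txt standard, so other crawlers may ignore them:

```
User-agent: *
# Blocks /foo_Q1 and /bar/baz_Q1, but not /foo_Q1.html (URL must end in _Q1)
Disallow: /*_Q1$
```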

    | lonniea
    0