Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Latest Questions

Have an SEO question? Search our Q&A forum for an answer; if not found, use your Moz Pro subscription to ask our incredible community of SEOs for help!


  • Thanks everyone!

    Link Building | | PeterM22
    1

  • http://edgesoftmedia.ro/contact/ Adrian Luzaric is great

    Web Design | | Damien-Anderson
    0

  • If page.html is the ultimate goal, then www.website.com/page.html?source=xxxyyy will pass link juice to page.html. Just make sure you tell Webmaster Tools' site configuration that 'source' is a URL parameter, and the juice will flow to page.html. I would also check that all the engines index it correctly.

    Online Marketing Tools | | oznappies
    0

  • You are most welcome. Please let me know if it works.

    Technical SEO Issues | | RyanKent
    0

  • I'm pretty sure they are just pulling those traffic volume figures directly from the Google AdWords Keyword Tool. You can dump your whole list of 500 keywords into that tool, and with the right settings you should get the results you are looking for. One thing to be careful with for multiple-word keywords is the difference between:

    keyword1 keyword2 (broad match)
    [keyword1 keyword2] (exact match)
    "keyword1 keyword2" (phrase match)

    Usually what I am most interested in is #2 above. So what I do is dump the 500 keywords into a text editor and replace the newlines, commas, or whatever keyword delimiter you have with a [ and then replace it on the other side with a ]. If this doesn't make any sense, you can link your list of keywords and I'll do it for you; it would probably only take 2 minutes.

    Keyword Research | | adriandg
    0
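The bracket-wrapping step described in the answer above can be sketched in a few lines (the keyword list and delimiter below are made-up examples, not data from the thread):

```python
def to_exact_match(raw, delimiter="\n"):
    """Wrap each keyword phrase in [ ] (AdWords exact-match syntax)."""
    return ["[%s]" % kw.strip() for kw in raw.split(delimiter) if kw.strip()]

# Hypothetical keyword list, one phrase per line.
keywords = "keyword1 keyword2\nblue widgets\ncheap flights"
print(to_exact_match(keywords))
# ['[keyword1 keyword2]', '[blue widgets]', '[cheap flights]']
```

The same find-and-replace could just as easily be done in a text editor, as the answer suggests; the function simply automates it for long lists.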

  • I see. That's tricky then. I suppose offering currencies for the relevant countries would be a start. Then get links from websites in that country, and build pages for those countries. Thanks.

    Keyword Research | | Benj25
    0

  • Thank you all for the positive feedback. Lately I have made the time for SEOmoz Q&A as I have been doing various SEO research and these boards can be a great way to stretch thought processes.

    Intermediate & Advanced SEO | | RyanKent
    0

  • Getting great links is the way to go, but that can be easier said than done. Primarily you want to attract natural links by promoting content, but you can build links too. Have a look at these tools: http://www.seomoz.org/labs/link-finder/index.php and http://www.seomoz.org/link-finder

    On-Page / Site Optimization | | SteveOllington
    1

  • While your #3 option is tempting, I can offer a few other ideas:

    Install Microsoft Translate on your site. It is an improvement over Google Translate in a couple of key areas. The widget can auto-detect a user's language and translate pages automatically if you wish. If your site requires users to log in, Google Translate will log the user out when a translation is requested, on any and all pages; MS Translate allows users to remain logged in to your site.

    While you may not be able to translate to every language, you should determine which language is used by the largest percentage of your customers and add that one language to your site as an option. The process may take time, but when it is complete you can then consider adding additional languages. If you can cover Russian, German, French and Spanish you will reach a lot more users.

    You can also try to simplify your pages. Imagine going to McDonald's in a foreign country: if you see a picture of a Big Mac, a bag of fries and a Coke, then a price, you understand everything without reading a single word. Clearly it isn't that easy to do with your website, but you can move a few steps in that direction.

    Paid Search Marketing | | RyanKent
    0

  • Thanks for the input. I think I will keep it as one site.

    Web Design | | eldoradoseo
    0

  • The highest-authority sites are not always going to be at the top; they could rank first for that phrase while their overall domain authority is lower than the 2nd, 3rd, etc. I use Chrome and the SEOmoz toolbar: enable the SERP overlay and it tells you each search result's Page Authority and Domain Authority. These are SEOmoz ratings, but I find them to be better than Google's PageRank.

    Link Building | | Seaward-Group
    0

  • Yes. From the Matt Cutts / Eric Enge interview:

    Eric Enge: Can a NoIndex page accumulate PageRank?

    Matt Cutts: A NoIndex page can accumulate PageRank, because the links are still followed outwards from a NoIndex page.

    Eric Enge: So, it can accumulate and pass PageRank.

    Matt Cutts: Right, and it will still accumulate PageRank, but it won't be showing in our index. So, I wouldn't make a NoIndex page that itself is a dead end. You can make a NoIndex page that has links to lots of other pages. For example you might want to have a master Sitemap page and for whatever reason NoIndex that, but then have links to all your sub Sitemaps.

    Technical SEO Issues | | RyanKent
    0

  • Will using HTTP ping and lastmod increase our indexation with Google? No. You can submit a perfect sitemap and ping Google with changes every hour, but that will not increase the number of pages which are indexed.

    A few good sources discussing sitemaps and indexing:

    http://followmattcutts.com/2010/03/23/matt-cutts-on-sitemap-indexing/
    http://faq.bloggertipsandtricks.com/2010/08/html-xml-sitemap-what-difference-matt.html

    If you have a site with solid navigation, good architecture and links, then there is no need to use a sitemap. Search engines will determine how often your site should be crawled based on your site's authority. They can also determine which pages have been modified by comparing the header dates with their database. I still use a sitemap, but mostly because the process is fully automated. I know of other well-indexed sites that do not use sitemaps at all.

    With the above understood, I'll try to offer a bit more information directly related to your questions. When you ask about pinging, I presume you are referring mainly to Google and Bing. In those cases, the answer to all four of your questions is NO. Listing your sitemap location in robots.txt will help other search engines you did not ping to locate your sitemap. This can include the SEOmoz crawler, for example.

    Technical SEO Issues | | RyanKent
    0
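For reference, advertising a sitemap location in robots.txt, as the answer above mentions, takes a single directive (the domain below is a placeholder):

```
Sitemap: http://www.example.com/sitemap.xml
```

The Sitemap line is independent of any User-agent group, so any crawler reading robots.txt can pick it up without being pinged.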

  • The tag is there to prevent Microsoft Smart Tags from rendering. It was only honored in beta versions of IE6, and no other browsers since. It's not worth including; you can safely delete these tags. There's an explanation of smart tags on Wikipedia, and the same question on Stack Overflow.

    Technical SEO Issues | | john4math
    0

  • Even though the recipes are the same, I doubt they'd be penalized for having duplicate content. If the search engine decides that it's duplicate content, it may filter out one or the other, and may knock one page down in the search results. If your first page of results were all the same page from different domains, the search engine would have failed you. The duplicate content penalties you're thinking about are more for spammers and spam content, where they're duplicating a lot of content, and doing it intentionally. In your example, there is a lot else on the page other than the ingredients and the recipes, so that should also mitigate some of the duplication, and the two pages are laid out a bit differently in the HTML, one with divs and one with tables. Sources: Google Webmaster Central Blog '08; High Rankings Advisor (Jill Whalen) '10.

    Technical SEO Issues | | john4math
    0

  • I'm working to remove low-quality pages from a directory while at the same time allowing a few high-quality pages in the same directory to be spidered and indexed. To do this I placed a robots noindex tag on the low-quality pages we don't want indexed.

    These noindex tags were implemented yesterday, but the low-quality pages aren't going away. I even used "Fetch as Googlebot" to force the crawl on a few of the low-quality pages. Maybe I need to give them a few days to disappear, but this got me thinking: why would Google ignore a robots noindex tag?

    Then I came up with a theory. I noticed that we include a canonical tag by default on every page of our site, including the ones I want to noindex. I've never used a noindex tag in conjunction with a canonical tag, so maybe the canonical tag is confusing the SE spiders. I did some research and found a quote from Googler JohnMu in the following article: http://www.seroundtable.com/archives/020151.html. It's not an exact match to my situation because our canonical tag points to itself, rather than another URL, but it does sound like using them together is a bad idea.

    Has anyone used or seen canonical and noindex tags together in the wild? Can anyone confirm or deny this theory that the canonical screws up the efficacy of the meta robots tag?

    Intermediate & Advanced SEO | | davidfricks
    1
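As a quick illustration of the tag combination being asked about, here is a minimal sketch that flags a page carrying both a robots noindex and a rel=canonical. The markup and the regex-based check are hypothetical examples for auditing your own pages; they are not how Googlebot actually parses HTML:

```python
import re

def has_noindex_canonical_conflict(html):
    """Return True if a page carries both a meta robots noindex
    and a rel=canonical link -- the combination discussed above."""
    noindex = re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]+noindex', html, re.I)
    canonical = re.search(
        r'<link[^>]+rel=["\']canonical["\']', html, re.I)
    return bool(noindex and canonical)

# Hypothetical page head combining both tags.
page = """<head>
<meta name="robots" content="noindex, follow">
<link rel="canonical" href="http://example.com/page.html">
</head>"""
print(has_noindex_canonical_conflict(page))
# True
```

Running a check like this across the directory would at least confirm which low-quality pages carry the self-referencing canonical alongside the noindex.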