You can PM me directly if something is sensitive. However, I prefer questions be posted publicly whenever possible so the whole community can respond and benefit from the answers.
Posts made by Ray-pp
-
RE: Question spam malware causing many indexed pages
Sometimes search engines do not update as frequently as we would like them to. Since you've already verified that these pages no longer exist on your site, I would also suggest that you actively try to have them removed using your Google Webmaster Tools account.
Google source: https://support.google.com/webmasters/answer/1663419?hl=en
-
RE: Optimizing for 3 international sites, how to avoid getting into trouble
Well, that's an understandable position to be in. Unfortunately, I cannot speak directly to the following:
"I was also told by our current SEO company that google only penalises sites that have duplicate content in the same localised areas...."
I'm guessing your developer means country-specific areas and not necessarily city/region targeting, which would be susceptible to penalties in this situation.
I would really like to hear from someone else in the Moz community with more direct experience on this international issue - hopefully someone speaks up soon.

-
RE: What is the best way to generate an automatic sitemap for google, bing and yahoo?
With Screaming Frog you would need to crawl, generate, and submit the sitemap (or upload it to your server) whenever it changes (you add/remove a page).
-
RE: Create a report with keyword, label, difficulty, global search volume, and ranking?
Glad to see you got most of the reporting needed.
You may also want to submit a feature request over here and see if the community and Moz team can plan to implement your feature.
-
RE: Optimizing for 3 international sites, how to avoid getting into trouble
As your sites are currently configured, you do run the risk of a duplicate content penalty. Your sites appear to be the same except for the URLs (same content across each domain name).
I'd start by asking why you need to have country specific domains. A .com ranks very well internationally. I understand the importance of having the unique domain names from a trademark perspective, but if the content will not be unique I suggest having them all 301 redirect to the .com domain. If possible, have the site translate to the country language when choosing from the international drop down, rather than visiting the country specific domain.
This cuts down on a lot of the maintenance needed for multiple domains and prevents duplicate content penalties.
If you want to keep the domains separate and have the content be mostly duplicate, you would need to look into cross-domain canonicalization. From what I saw, you do not have any canonical tags on the site at this point. Plus, we need to remember that canonical tags are only a suggestion for the search engines to follow and they may choose not to adhere to the tags you put in place.
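For illustration, a cross-domain canonical is just a standard canonical tag that points at a page on the preferred domain. A minimal sketch, using placeholder domains and a placeholder path (not your actual URLs):

```html
<!-- Placed in the <head> of the duplicate page on the country-specific
     domain (e.g. example.co.uk), suggesting the .com version as the
     preferred URL for search engines to index: -->
<link rel="canonical" href="http://www.example.com/products/widget/" />
```

Again, this is only a hint to the search engines, not a directive like a 301 redirect.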
-
RE: What is the best way to generate an automatic sitemap for google, bing and yahoo?
Hi Alec,
It looks like your website is a custom-built website, correct? No CMS like WordPress or similar. In that case, you would want your developers to create an XML sitemap that automatically includes new pages as they're created for your site. If that's not possible, then you need to update the sitemap manually with new pages as you create them.
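For reference, a minimal XML sitemap follows the sitemaps.org protocol and looks like this (the domain and date below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2015-01-01</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/about/</loc>
  </url>
</urlset>
```

Each page you want crawled gets its own `<url>` entry; `<lastmod>` is optional.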
Once you submit the sitemap link to Google / Bing they remember the URL to your sitemap and crawl the sitemap at their leisure (more popular sites get their sitemap crawled more frequently). So, once you submit a sitemap, as long as the URL remains the same, the bots will auto-crawl your sitemap for updates.
If you cannot have a sitemap autogenerated and do not want to always manually add the links (or do not know how to maintain an XML feed) then I suggest using Screaming Frog SEO Spider's sitemap feature. The tool will crawl your website and you can generate a sitemap for your site. Then, you can submit that sitemap as it changes to the proper search engines.
Since your websites are set up as unique domains, you would need to submit a sitemap with the unique URLs for each domain. Your other question about your 3 web properties raises some concerns, which I will try to address there.
-
RE: Help I deleted my container in Google Tag Manager!
Hmm, tough spot you're in there. When you create a new container in GTM it assigns a unique container ID for that container - I don't think there is a way to revert back to an already deleted container, unfortunately.
-
RE: Create a report with keyword, label, difficulty, global search volume, and ranking?
The keywords would need to match exactly to make sure the VLOOKUP works properly. If you're tracking keywords that are not in the difficulty list (or vice versa), then the VLOOKUP will return an #N/A value, which lets you know the keyword is not on the list.
-
RE: Help I deleted my container in Google Tag Manager!
The container snippet is placed manually on your website. If you want to remove a Google Tag Manager container, you need to go into your website's template files and remove it accordingly.
-
RE: Create a report with keyword, label, difficulty, global search volume, and ranking?
Hi promfgsystems,
A report like that is not available within the Moz tools with all the information in one place. However, if you're familiar with Excel, then combining a few reports together to create that dashboard is not very difficult with the help of VLOOKUPs.
- Download the Tracked Keywords report - this will give you keywords, labels, and ranks.
- Download the Keyword Difficulty and SERP Analysis report - this will give you keyword difficulty.
- Bring all your keywords to the Google Keyword Planner Tool - this will give you global search volume (Moz cannot show Google search volume).

Then put all of these reports into a single Excel workbook and use a VLOOKUP to combine the information into one simple report.
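If you'd rather script the merge than maintain VLOOKUPs, the same join can be sketched in a few lines of Python. The column names and sample rows below are hypothetical, not the actual Moz or Keyword Planner export formats:

```python
# Hypothetical rows from the three exports; real export column
# names and values may differ.
tracked = [
    {"keyword": "seo tools", "label": "brand", "rank": 3},
    {"keyword": "link building", "label": "content", "rank": 12},
]
difficulty = {"seo tools": 41}                        # keyword -> difficulty
volume = {"seo tools": 9900, "link building": 2400}   # keyword -> searches

# Equivalent of a VLOOKUP: keywords missing from a report fall back
# to "N/A", flagging that they are not on that list.
report = [
    {
        "keyword": row["keyword"],
        "label": row["label"],
        "rank": row["rank"],
        "difficulty": difficulty.get(row["keyword"], "N/A"),
        "volume": volume.get(row["keyword"], "N/A"),
    }
    for row in tracked
]
```

As in Excel, the join only works when the keyword strings match exactly across the reports.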
-
RE: Do I eventually 301 a page on our site that "expires," to a page that's related, but never expires, just to utilize the inbound link juice?
Hi Paul - You're correct from the start: you want to 301 those dead pages to their closest related pages, such as the sports team or sport category. From what you've written, I'm sure you can figure this one out - nice job!
-
RE: Dilemma about "images" folder in robots.txt
I recommend allowing Google to crawl those images. Google optimizes its crawl rate, and once it has done a complete crawl it will understand how often to recrawl certain areas of your site. My main concern would be that you are losing potential rankings and indexing from those images - if they are unique and high quality, you definitely want Google to crawl them, understand the file names, and index them appropriately.
I wouldn't be concerned about Google bot eating up your server resources. If it does become a problem, then you can go back and adjust the bot access through the robots.txt, as you've done already. However, I would let them in first and only react if it becomes a problem.
I have tens of thousands of product images accessed by Googlebot, and it is no concern to my ecommerce company or our server resources. I'm not saying it can't be a potential problem, but the benefit outweighs the risk - I choose a reactive stance in this situation.
Closely monitor your Google Webmaster Tools account, watch the crawl rate and statistics, and if it becomes an issue then decide on which image folders should or shouldn't be indexed.
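If you decide to open the folder back up, removing the `Disallow` rule from robots.txt is usually all it takes; an explicit allow looks like this (assuming your images live under a `/images/` path - adjust to your actual folder):

```
User-agent: *
Allow: /images/
```

Since crawling is allowed by default, the `Allow` line mainly matters when a broader `Disallow` rule would otherwise block the folder.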
-
RE: Moz crawler finding my homepage multiple times
Hello Keenan-price,
Welcome to the Moz community!
Moz is reporting these duplicates correctly. Each of the listed URLs is seen as a unique URL and a unique page. This is a common problem when a website does not have the proper canonical tags and 301 redirects in place for these URLs.
You'll want to decide on how your website should be displayed (which URL you prefer) and implement the canonical tag and 301 redirects.
The 301 redirects could be done with your .htaccess file, depending on your site environment. The canonical tag implementation would also depend on your site's environment (WordPress, custom development, etc.).
Also, make sure to go into your Google Webmaster Tools account and set your preferred domain once you've decided how you want the URL to be displayed.
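As a rough sketch of the .htaccess approach (assuming an Apache server with mod_rewrite enabled and a placeholder domain - your rules will differ based on which URL variant you choose as canonical):

```
RewriteEngine On

# Redirect the bare domain to the www version (placeholder domain)
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

# Collapse /index.html to the root URL
RewriteRule ^index\.html$ / [R=301,L]
```

Pair these redirects with a matching canonical tag on the homepage so every variant resolves to one URL.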
-
RE: Page Authority Migration
PageA.html's links still exist in the wild and haven't been updated to reflect the new page. When calculating page authority, we still see all the links pointing to PageA.html, and an authority is calculated for the page even though it has been permanently redirected to PageB.html.
The calculated authority is then transferred through the 301 redirect. But since all the external links still exist, you won't see the PA vanish completely for PageA.html.
-
RE: How much copy do you need on the homepage?
As much copy as is needed to add quality information for your readers and clearly communicate your value proposition.
The industry has moved away from 'a minimum of 300 words is needed' to focus on the quality of content provided to the readers. Branding also influences a home page - some brands prefer a clean, minimal interface and already have the traction (word of mouth, referrals) needed to keep themselves sustained.
I recommend adding enough content to your home page that clearly explains why someone would want to use your product and the benefits they will receive from using the product. If I can get that from your home page, then it is a successful user experience.
Of course, keep that content targeted to your main topic / terms, but don't think that optimizing only the major on-page components will keep you high in the rankings.
-
RE: Adwords Advice
I like to target a major city and then use location bid modifiers to increase/decrease the bids in the appropriate locations. If your ad copy can be generic enough and still successful, then it could save you a lot of time by targeting London and modifying the bids for successful cities in the immediately surrounding area.
If, however, those smaller cities have enough traffic then it would be best to set them up with their own campaigns to have the highest level of targeting and focused ad copy possible.
I agree with David's suggestion of setting up location specific campaigns and including the services offered as ad groups, with the addition of bid modifiers as appropriate.
-
RE: Adwords Advice
Generally speaking, if you set up your AdWords account to match the site's navigation structure, you get a clean and manageable AdWords architecture. I almost always set up my client accounts by matching their navigation structure. If that mapping is confusing, then the site's navigation structure is probably also confusing and needs to be optimized.
-
RE: Guest Posting Campaign For New Site
The best way is to provide high-quality, value-added content without spamming keyword-rich anchors back to your blog. Guest posting, when done properly, can still be an effective link building strategy as long as the links don't come at the expense of the content.
For example, find a highly related niche website and write up an insightful, lengthy article (none of that 300-word crap). In that article, touch briefly on a related topic that you've expanded upon in detail on your own website - like an aside a reader can visit to gain a deeper understanding of a supplemental topic. If both articles (the guest post and your own article) are unique and high quality, the contextual link should be a positive addition to your link portfolio.
You want it to be seen as natural and appropriate - not spammy and an obvious attempt at only receiving a link. Keep the readers' best interest in mind and you should be OK, as long as the website you're guest blogging on also sticks to legitimate link building strategies.
What topics are you intending to write about?
-
RE: Do three way links still help in moderation?
If you guys create a simple link network between one another, then no - it probably won't help much, and you risk being seen as manipulating links.
Instead, I suggest each of you write up quality content focused on your own site's niche and provide contextual links to one another. This way you are giving your users valuable content, and if each article focuses on its author's niche, it makes sense to provide a link to that site.