Latest posts made by ATMOSMarketing56
-
Hreflang for Canadian web visitors (when their browsers are set to en-us)
We're in the process of implementing hreflang markup for Canadian & US versions of a website.
We've found that about half of our Canadian traffic comes from browsers set to en-US (rather than en-CA, as you'd expect). Should we be concerned that Canadians with en-US browser settings will be shown the US version of the site, since the hreflang markup specifies 'en-us' for the US pages?
Our immediate thought is that since they're likely searching from Google.ca and will also have Canadian IP addresses, this won't be an issue. Does anyone have any other thoughts here?
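For reference, this is roughly what the annotations would look like. A minimal sketch with hypothetical placeholder URLs; note that both versions of the page need to carry the full set of tags for the annotations to be valid:

```html
<!-- Placed in the <head> of BOTH the US and CA versions (URLs are hypothetical) -->
<link rel="alternate" hreflang="en-ca" href="https://example.com/ca/page/" />
<link rel="alternate" hreflang="en-us" href="https://example.com/us/page/" />
<!-- Optional fallback for users matching neither locale -->
<link rel="alternate" hreflang="x-default" href="https://example.com/page/" />
```

An `x-default` entry gives Google an explicit fallback rather than forcing a choice between the two English variants.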
-
RE: What on-site issue could be causing Moz to not detect internal links?
Thanks for your response. Here is some more information about the issue -
Are your links hidden in Flash or Java? No
Is the page an iframe? No
A little more information about this site:
- The site is indexed by Google, and Screaming Frog crawls it successfully
- It was recently converted sitewide from HTTP to HTTPS
-
What on-site issue could be causing Moz to not detect internal links?
Hey guys,
We've done a crawl and none of our internal links are showing up. Are there any on-page factors that would prevent Moz from being able to detect our internal links?
Thanks!
-
Why is a site no longer being indexed by Google after HTTPS switch?
A client of ours recently had a new site built and made the switch to HTTPS. We made sure to redirect all of the HTTP pages to HTTPS and submitted a new sitemap to Google. GWT says the sitemap was submitted successfully, but only 4 pages have been indexed when there should be over 2,000. This has caused organic traffic to plummet, and we can't find the issue. Has anyone else run into this after an HTTPS switch and figured out how to fix it?
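One thing worth double-checking is that the HTTP-to-HTTPS redirect is a single sitewide 301 rather than a chain, and that the submitted sitemap lists only HTTPS URLs. A minimal sketch of the redirect, assuming an Apache server with mod_rewrite enabled:

```apache
# Hypothetical .htaccess sketch: 301-redirect every HTTP request
# to its exact HTTPS counterpart in one hop
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```

Also worth ruling out: a stray `noindex` meta tag or a robots.txt `Disallow` carried over from the staging version of the new site, either of which would explain indexing stalling at a handful of pages.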
-
RE: Can PDF be seen as duplicate content? If so, how to prevent it?
I think you can set it to public or private (logged-in only) and even put a price tag on it if you want. So yes, setting it to private would help eliminate the duplicate content issue, but it would also hide the links I'm using for link building.
I would imagine that since this guide links back to our original site, it would be no different than someone copying content from our site and linking back to us, thus crediting us as the original source. Especially if we make sure it's indexed through GWT before submitting it to other platforms. Any good resources that delve into that?
-
RE: Can PDF be seen as duplicate content? If so, how to prevent it?
What about this instance:
(A) I made an "ultimate guide to X" and posted it on my site as individual HTML pages for each chapter
(B) I made a PDF version with the exact same content that people can download directly from the site
(C) I uploaded the PDF to sites like Scribd.com to help distribute it further, and build links with the links that are embedded in the PDF.
Would those all be dup content? Is (C) recommended or not?
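For the PDF hosted on your own site in (B), one option is to declare the HTML version canonical via an HTTP `Link` header, since you can't put a `<link>` tag inside a PDF. A hedged sketch for an Apache setup with mod_headers enabled (filename and URL are hypothetical):

```apache
# Hypothetical .htaccess sketch: point the PDF's canonical at the
# HTML version of the guide using an HTTP Link header
<Files "ultimate-guide.pdf">
  Header add Link "<https://example.com/guide/>; rel=\"canonical\""
</Files>
```

That addresses (A) vs. (B) on your own domain; it doesn't help with copies on third-party sites like Scribd in (C), where you have no control over the response headers.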
-
SEO for Online Auto Parts Store
I'm currently doing an audit for an online auto parts store and am having a hard time wrapping my head around their duplicate content issue. The current setup is this:
- The catalogue starts with the user selecting their year of vehicle
- They then choose their brand (so each of the year pages lists every single brand of car, creating duplicate content)
- They then choose their model of car and then the engine
- And then this takes them to a page listing every type/category of product they sell (so each and every model/engine combination has the exact same content!). This amounts to literally thousands of pages being seen as duplicates
It's a giant mess. Is using rel=canonical the best thing to do? I'm having a hard time seeing a logical way of structuring the site to avoid this issue.
Anyone have any ideas?
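If rel=canonical is the route taken, the usual pattern for faceted catalogues like this is to have every year/brand/model/engine variant point at one representative version of the product listing. A minimal sketch, with a purely hypothetical URL structure:

```html
<!-- Placed in the <head> of each year/brand/model/engine variant page,
     all pointing at the same representative category page (hypothetical URL) -->
<link rel="canonical" href="https://example.com/parts/brake-pads/" />
```

The variants then consolidate their signals into the canonical page instead of competing with it, though the canonical target itself should still be a page you actually want ranking.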
-
RE: Keyphrase ranking a geo-redirected site in Google
You could just add rel="canonical" tags pointing to the US pages you want to out-rank the UK pages with.
-
Proper Hosting Setup to Avoid Subfolders & Duplicate Content
I've noticed that when hosting multiple websites on a single account, your main site lives in the root public_html folder, but when you create a subfolder for a new website it actually creates a duplicate website:
eg.
http://kohnmeat.com/ is being hosted on laubeau.com's server.
So you end up with a duplicate website: http://laubeau.com/kohn/
Anyone know the best way to prevent this from happening? (i.e. canonical? 301? robots.txt?) Also, maybe a specific 'how-to' if you're feeling generous 
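Of the options listed, a 301 is usually the cleanest, since robots.txt would only block crawling of the duplicate path without consolidating it. A hedged sketch, assuming Apache with mod_rewrite and an .htaccess file placed inside the addon folder (`/kohn/`):

```apache
# Hypothetical .htaccess sketch inside public_html/kohn/:
# any request that arrives via the parent domain's path
# (laubeau.com/kohn/...) gets 301'd to the real domain
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?laubeau\.com$ [NC]
RewriteRule ^(.*)$ http://kohnmeat.com/$1 [R=301,L]
```

Requests arriving via kohnmeat.com itself don't match the host condition, so the real site is unaffected; only the duplicate subfolder path redirects.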
-
RE: How to promote good content?
-
It's an information resource that was built with the intention of being of value to prospective salon owners (similar to how the seomoz guide is of value to prospective SEOs). The reader will likely not be an immediate customer, but may remember us down the road when they are ready to purchase our products (similar to readers of the seomoz guide).
-
It will be hosted on the site we sell our salon-related products from. The goal is to use it as link bait. Links are the top priority; traffic is just a bonus.
Best posts made by ATMOSMarketing56
ATMOS Marketing of London, Ontario is an online sales and lead generation company. We focus on increasing the number of sales and leads by generating targeted traffic and improving conversion rates.
http://www.atmosmarketing.com