Latest Questions
Have an SEO question? Search our Q&A forum for an answer; if not found, use your Moz Pro subscription to ask our incredible community of SEOs for help!
-
Using "Divs" to place content at top of HTML
Thanks guys. That was what I thought. But it's always good to get some confirmation.
Technical SEO Issues | | tdawson090 -
What is the best practice for H1 titles for products inside categories? Long tail or short tail?
I'd make the breadcrumbs Home > Running Shoes > Purple Pumas and the title "Purple Puma Running Shoes". You have to think about what the customer needs when they land there. What if they have a screen reader? How do customers look at breadcrumbs? They're going to be looking to the title for information about what the product is when they come in from a search. Most of the time, brand names will only get you so far; customers won't necessarily recognize what a product is from the brand name alone.
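If you want search engines to understand that breadcrumb trail explicitly, it can also be exposed as structured data. A minimal sketch using schema.org's BreadcrumbList (the domain and category path here are hypothetical, matching the example above):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home",
      "item": "https://www.example.com/" },
    { "@type": "ListItem", "position": 2, "name": "Running Shoes",
      "item": "https://www.example.com/running-shoes/" },
    { "@type": "ListItem", "position": 3, "name": "Purple Puma Running Shoes" }
  ]
}
</script>
```

The last item carries no `item` URL because it represents the current page.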
Technical SEO Issues | | EricaMcGillivray0 -
Weebly vs Wordpress for SEO?
Thanks Matt for your answer! Let me ask you this: with regard to all the elements that are needed in SEO, I was wondering whether these are possible when using the Weebly interface. Is it possible to 301 redirect a non-Weebly domain to a Weebly domain? Are canonical tags available on Weebly? Are URL structures problematic? For instance, I read that URLs skip subdirectories, e.g. www.example.com/weebly instead of www.example.com/hosting-reviews/weebly. Thanks for letting me know!
Inbound Marketing Industry | | Ideas-Money-Art0 -
Changing commenting platforms...will deleting old comments hurt SEO?
Is your current commenting system allowing FB logins? In my experience, what users like about FB commenting is simply being able to comment without having to sign up first. I would stick with your current commenting system and just allow social logins, if you don't already.
On-Page / Site Optimization | | max.favilli0 -
A 302 Redirect Question | Events Page Updates
Will, I'm not familiar with the CMS you're using, but to answer your question about rel=canonical: no, that is not an instance where you should use that tag. Canonical tags are for cases where duplicate content is unavoidable, such as when sorting a product category page produces different URL parameters based on the sort type.
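As a concrete sketch of that sort-parameter case (the URLs are hypothetical), each sorted variant would carry a canonical tag pointing back to the unparameterized page:

```html
<!-- Served on https://www.example.com/shoes/?sort=price-asc -->
<link rel="canonical" href="https://www.example.com/shoes/" />
```

That way the sorted URL can still be crawled, but ranking signals consolidate on the main category page.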
Intermediate & Advanced SEO | | LoganRay0 -
If other people copy your content, is it really GOOD or BAD for SEO?
Hi Juan, It's a bad thing. Will's and Bryan's responses are great. My personal experience provides another side to this. Years ago I ran an ecommerce site in a smaller market. Business was good, and then I started getting these random emails about orders that I could not find any record of. I spent literally hours going through documents and the database trying to figure out what was going on, and came to the conclusion I was getting trolled. But these people were insistent; they just wouldn't let up. I was stealing from them, and they were threatening me with lawsuits and whatnot. About a month after this started I found the problem: two copycat websites had popped up and copied some of my content word for word, markup for markup, resulting in pages on their sites with links to "my contact page". Turns out those people did have an order; they were just complaining to the wrong company. Though I did convert a couple of them into customers, the time I spent looking up erroneous orders and responding to emails made it a bad experience. My thoughts, Don
Branding / Brand Awareness | | donford1 -
Site Category structure detrimental to SEO?
Search engines tend to reward sites that have more comprehensive pages, so I tend to think these category pages are detrimental. Besides creating too many pages with very little content, they're all competing amongst themselves and probably appear as duplicate content; there's likely to be more HTML for your header and footer on the page than actual content. From a user-experience standpoint, I'd also personally get frustrated clicking through that many times to get to what I want, only to find there are so few products in that category. Or if I hopped straight there from a Google search, onto a page with so few products, I'd be unlikely to stick around long. I'd rather see more and narrow it down myself. I would suggest doing one of two things. A - Filter dynamically. Instead of having all these as permanent pages, make the TV category the last permanent page and use checkboxes to filter down. That way you're going to a dynamic URL, not a static one. Your customers get the benefit of seeing the set of products they want, but it's not a permanent "page" on your website that would get indexed. How difficult this is to change depends heavily on what technology you're using. B - If you can't filter dynamically, it might be easier to add meta noindex tags, or update robots.txt, to block everything below the TV category. You'll still have pages where customers can see their narrow set of products, but the meta tags or robots.txt will tell spiders not to index or crawl pages that far down. As a side note, special characters in URLs are not a best practice, so I'd get rid of the ampersands if possible.
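A rough sketch of option B (the paths are hypothetical). Note the two mechanisms are alternatives, not complements: a page disallowed in robots.txt won't be crawled, so a meta noindex on that same page would never be seen.

```html
<!-- On each deep category page: keep it usable for visitors,
     but ask engines not to index it -->
<meta name="robots" content="noindex, follow">
```

Or, blocking crawling instead via robots.txt:

```
# Block crawling of everything below the TV category
User-agent: *
Disallow: /electronics/tvs/
```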
Technical SEO Issues | | WebElaine0 -
Search visibility increase with international SEO
Well, in my first answer there is a figure of 54M for the Hispanic/Latino population in the US. But you can't know how many of them search in English versus Spanish. I can speak to Bulgarians: even when they migrate to another country, they still speak Bulgarian at home, watch Bulgarian TV, read Bulgarian newspapers online, and buy Bulgarian goods. More interestingly, they still search in Bulgarian. For example, even on google.co.uk you can get Bulgarian searches and results. A real case: a friend of mine owns a TV repair service center and got a phone call from London about a TV repair. A lady there wanted to find someone to fix her mother's TV. The funny part: the distance between the service center and the home was only 100-200 meters. You don't know what you don't know...
International Issues | | Mobilio0 -
2 sitemaps on my robots.txt?
We recently changed our protocol to https, and our robots.txt includes a link to our new https sitemap. Our agency is recommending we add a second sitemap to our robots.txt file pointing to our insecure (http) sitemap while Google is reindexing our secure protocol. They recommend this as a way for all SEs to pick up on the 301 redirects and swap out unsecured results in the index more efficiently. Do you agree with this? I am in the camp that we should have only our https sitemap and Google will figure it out; having two sitemaps in our robots.txt, one to our old http and one to our new https, seems redundant and might be viewed as duplicate content rather than helping SEs see the 301s and reindex the secure links. What's your thought? Let me know if I need to explain more.
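For reference, two Sitemap directives in one robots.txt file are syntactically valid; the agency's suggestion would look something like this (the URLs are hypothetical):

```
# robots.txt during the http -> https migration
Sitemap: https://www.example.com/sitemap.xml
Sitemap: http://www.example.com/sitemap.xml
```

Sitemap directives sit outside any User-agent group and simply point crawlers at URL lists, so listing both is not "duplicate content" in the ranking sense; whether it actually speeds up reindexing is the open question here.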
Technical SEO Issues | | MonkStein0 -
Is sitemap required on my robots.txt?
Hi Juan, You should also know that you can have multiple Sitemap directives in one robots.txt file. This is common among international sites and large ecommerce sites.
Technical SEO Issues | | LoganRay0 -
Ranking with subdomain - Urgent
Hello, To answer your question: yes, it is possible to rank sub-domains. In general they tend not to rank as well as root domains, but they can and do rank. Depending on the difficulty of the keywords, you'll have different results. I think you're going to have more challenges using Wix in general than with other CMSs. While I don't have any personal experience with Wix, I have seen many posts here about problems with ranking, and more often with reporting. Google "Moz SEO Wix" to see some of the other questions and concerns raised on these forums. Wix uses JavaScript to load data into your webpages. The problem is that this is client-side scripting, which basically means the content is rendered in the browser after you get to the page, while crawlers expect the content to be delivered from the server (server side). This is basically a way to say "most" crawlers/spiders will not be able to read your site, which makes reporting, and SEO in particular, difficult. That is not to say your site will not rank; Google has one of the most sophisticated crawlers and can likely read through it, but trying to tweak anything will prove challenging without the help of many of the SEO tools out there. I will also add that "free sub-domain" sets off an alarm for me. Sub-domains are inherently free. Wix may mean that you may use their tools to design the site, but it just sounds funny, kinda like saying "free water with every car wash". Anyway, hope that answers your questions and gives you some help, Don
Intermediate & Advanced SEO | | donford0 -
Ecommerce question - overoptimisation
Hi Don, I like your answer and I was wondering if you could give me some advice also? I am doing the SEO for a Shopify site, www.neweyeco.net. They want to be found for prescription sports glasses and other keywords related to that, such as prescription sport glasses. The other keywords we want to rank for are related to specific sports under "collections", e.g. cycling glasses, golf glasses, fishing sunglasses, etc., not necessarily prescription. Also, under features they want to rank for blue mirror sunglasses, clear glasses, yellow lens sunglasses, etc. How would you recommend these be set up to avoid cannibalization or over-optimization? I have run into issues with the Alexa rank zooming up to 16.8 million from 500K in January! Any advice would be very welcome. Kind regards, Sarah
On-Page / Site Optimization | | Skemazer1 -
AngularJS - What To Consider?
While developing that new website with AngularJS, you need to keep in mind SEO and what crawlers do. To put it simply, you should always render the page server side on first load in a crawler-friendly way, and use Angular client side for UX. Using AJAX to load content is bad for SEO no matter what JS framework/library you use; from old jQuery to Angular or React it's always the same story: don't load SEO-valuable content through AJAX, or crawlers will likely ignore it entirely. So, make sure you are giving visitors and crawlers valuable SEO content right away on first load, server side, and use Angular to manipulate it according to user actions. And while doing that, don't be tempted to cloak, or Google's axe will hit you. I have been developing websites using AngularJS for years now, and have always gotten great SEO results. Good luck.
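A minimal sketch of that "server side first, Angular after" idea, using plain Node with hypothetical product data (not the author's actual setup; a real app would use a templating engine or a framework's server-rendering support):

```javascript
// Build the full page HTML on the server, so the SEO-critical
// content is in the markup a crawler receives on first load,
// rather than fetched later via AJAX.
function renderProductPage(product) {
  return [
    '<!doctype html>',
    '<html><head><title>' + product.name + '</title></head>',
    '<body>',
    '<div id="app">',
    '  <h1>' + product.name + '</h1>',
    '  <p>' + product.description + '</p>',
    '</div>',
    // Client-side JS (e.g. Angular) attaches here afterwards,
    // only to enhance interactivity, not to inject content.
    '<script src="/app.js"></script>',
    '</body></html>'
  ].join('\n');
}

const html = renderProductPage({
  name: 'Purple Puma Running Shoes',
  description: 'Lightweight trail runners.'
});
```

The key point is that `html` already contains the title, heading, and description before any client-side script runs.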
Intermediate & Advanced SEO | | max.favilli0 -
Moz Page Analysis Country different to Who.is?
Hi David, I noticed the same issue: the domain looks like it's hosted in the UK, but on WHO.IS it shows as Ireland. Where does Moz scrape this information from, and does it affect SEO? Many thanks, Jayne
Other Research Tools | | NCOREGSEO0