Posts made by Lumina
-
RE: Preferred domain - is it important you choose one?
Dmitrii isn't wrong, but I feel it's always important to determine which version works best for your branding and marketing materials, and which is most memorable and useful for your customers and leads. Some sites do better with "www", and some without. Don't leave it to chance and let Google choose one for you.
-
RE: Hi Moz community! Why is the number of high priority issues (404s) significantly different between Moz and Google Search Console?
Tim's correct, but I'll add a couple of things:
- Never take any analytics platform as gospel - each of them has its inaccuracies. That's one of the benefits of running multiple analytics platforms on your site(s).
- In this case, I would recommend following Google's count. Not only because it's higher, but because Google's data is generally closer to what's actually happening in SERPs. When I say the volume of page issues is a factor, what I mean is that either Google is simply seeing more errors than Moz, or Google's crawl is itself in error (which happens on occasion). Either way, that needs to be resolved, so fixing the issues Google reports should be the priority (there's a quick way to double-check them yourself in the sketch after this list).
- Lastly, the Yoast plugin is wonderful in many ways. Don't take everything it gives you at face value, but more often than not it's a fantastic resource for on-page SEO. There's also a plugin simply called "Redirection" that can set up 301s from a page that's 404ed to a more relevant page, should you wish to go that route. Either way, make sure you have either a dedicated 404 page (with actionable content so users aren't lost) or some other way to populate those pages with useful content.
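A minimal sketch for that double-check, assuming you've exported the flagged URLs into a hypothetical urls.txt file (one URL per line); it simply reports the status code each URL returns right now:

```python
# Report the current status code for each URL flagged as a 404.
# Assumes urls.txt holds one URL per line, e.g. exported from
# Search Console's crawl errors report.
import requests

with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    try:
        # HEAD keeps things light; switch to GET if a server rejects HEAD.
        resp = requests.head(url, allow_redirects=True, timeout=10)
        print(f"{resp.status_code}  {url}")
    except requests.RequestException as exc:
        print(f"ERROR  {url}  ({exc})")
```

Anything that still comes back as a 404 is a candidate for a Redirection rule or, at minimum, for your dedicated 404 page to catch.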
Good luck!
-
RE: Old site penalised, we moved: Shall we cut loose from the old site? It's currently 301'd to the new site.
Without seeing the data in detail, I'd have to give a pretty general answer: if your old site has a poor DA, has been penalized, or is itself spammy, then having it redirect to a new site is only one step better than just staying on the old site. When you redirect, you're passing link juice from website A to website B - both bad and good. It's the same reason you disavow bad links in the first place, though in this case the redirect is at least under your control.
Anything that points to your new site factors into its overall stats. It sounds like your old site is hindering your attempt to break away from the bad links. My recommendation would be to cut the redirect, and only reinstate it once you've cleaned up website A.
-
RE: Are there any BAD SEO implications to acquiring backlinks from customers?
There aren't really any negative SEO implications to having a client's site backlink to yours. The only exception would be if their site is spammy or something to that effect. This is true both for having a link to their site on yours and vice versa (which I only bring up because of your last question).
There's also nothing wrong with offering a discount on that customer's next purchase in exchange, though I've found that most customers are happy to link to you without one. I've actually set it up so that our contracts include a clause saying we're allowed, at our own discretion, to add our logo and a link back to our site in the footer of theirs (with certain reasonable restrictions, etc.).
Either approach works, and simply asking how they feel about your service is an equally valid way of getting there. You're good to go!
-
RE: Unique Pages with Thin Content vs. One Page with Lots of Content
Not to sound cheeky, but neither is preferable. Many unique-but-thin pages means you have a lot of pages that are of little to no use to a user, while also putting you at risk for, well, thin content. Keeping all of this content on one page means that the page is poorly targeted (especially bad since this is a service page), that the content will be difficult for users to digest, and that you have no real option for navigating to a specific service.
Since it's clear you know you're in a "lesser of two evils" situation, my recommendation would be to go with the many-pages approach. You can at least target those pages better, and it leaves you open to adding more content later on. However, this option is higher risk, higher reward: you might get penalized for thin content. But if time is the limiting factor here, I think relatively thin pages are the more manageable choice.
As an amendment, might I recommend adding a couple of universal content sections? Meaning, within these thin pages, alongside the ~100 words of good, unique content, include a section that's the same across each page, so you get more content on these pages without sacrificing as much time. Something like links to relevant blog articles, customer testimonials, or a personable explanation of the company's methodology for delivering its services.
-
RE: Shortened page titles and changed URLs to match, will this affect my page rankings?
Also, for the future, I recommend using two plugins for WordPress:
- Redirection by John Godley: This plugin lets you set up redirects from the WP dashboard, which means you can keep track of what's being redirected where without leaving your WP admin.
- Yoast SEO by Team Yoast: This plugin lets you add and edit a lot of different SEO-related items, gives you tips regarding keywords you want to focus on, and a lot more. It's incredibly useful.
To be clear, I don't work for or with either of these developers/teams, but they're great plugins that sound as though they'd be especially helpful for you.

-
RE: Shortened page titles and changed URLs to match, will this affect my page rankings?
If I understood what you're saying, it seems you may have mistaken the page title and the URL for the same thing. You can alter a page title without affecting the URL, and vice versa. For example, in Google SERPs (Search Engine Results Pages), the blue text shown for each result is the page title, while the URL is the green text beneath it.
Now, with that said, page titles shouldn't exceed roughly 60 characters (about 600 pixels). Beyond that point, the title gets truncated in SERPs, so only part of it will be shown.
Regarding switching the URLs back to their originals, I'd recommend simply changing them back to what they were rather than redirecting them. Either way, it sounds like this whole endeavor has hurt your SEO, and either method may have implications. When you do this, any URL that isn't hardcoded should be fixed by WordPress automatically: links populated internally (e.g., in menus, navigation, the footer) should update without you having to do anything, though I'd strongly suggest you double-check them manually. Hardcoded links (e.g., links in body text, images, some widgets), however, might not be resolved automatically; those you'll need to tend to on your own. Meaning, if the second paragraph of blog article A links to blog article B, you'll probably need to edit article A and replace the old URL with the new one. A quick way to find these is sketched below.
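If you'd rather not click through every post to find those hardcoded links, here's a minimal sketch using the standard WordPress REST API. The site URL and the old path fragment are placeholders you'd swap for your own, and it assumes the default /wp-json/wp/v2/posts endpoint is enabled:

```python
# List published posts whose rendered content still contains the old URL
# fragment. Assumes the default WordPress REST API is enabled.
import requests

SITE = "https://example.com"      # placeholder: your site
OLD_FRAGMENT = "/old-page-title"  # placeholder: the old URL path you changed

page = 1
while True:
    resp = requests.get(
        f"{SITE}/wp-json/wp/v2/posts",
        params={"per_page": 100, "page": page},
        timeout=15,
    )
    if resp.status_code != 200:  # WordPress returns an error past the last page
        break
    posts = resp.json()
    if not posts:
        break
    for post in posts:
        if OLD_FRAGMENT in post["content"]["rendered"]:
            print(f"Still links to the old URL: {post['link']}")
    page += 1
```

Pages have their own endpoint (/wp-json/wp/v2/pages), so repeat the loop there if needed.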
If you need any clarifications or further help, please ask! And sorry if I mistook anything you'd said.
-
RE: PDF Instructions come up in Crawl report as Duplicate Content
Yes, you absolutely should add unique text to each of these pages. Not only so that they aren't flagged as duplicate, but because it's always an SEO benefit to have more good content. If you don't have the capacity to write such content, however, you may want to remove them from indexation.
The reason these pages are being flagged as duplicates is that Google isn't parsing the PDFs, which means all Google (and others) see are pages with no content and an iframe. It's also pertinent to note that Moz will flag anything with more than 90% overlap as duplicate content.
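If you're curious how close two of these pages actually are, a rough comparison of their visible text is easy to sketch. The two URLs below are placeholders for your instruction pages, and the tag-stripping is deliberately crude, so treat the result as a ballpark figure rather than Moz's exact calculation:

```python
# Rough similarity check between two pages' visible text - an illustration,
# not the exact algorithm Moz uses for its 90% duplicate threshold.
import difflib
import re

import requests

def visible_text(url):
    html = requests.get(url, timeout=15).text
    html = re.sub(r"<(script|style)[^>]*>.*?</\1>", " ", html, flags=re.S | re.I)
    text = re.sub(r"<[^>]+>", " ", html)
    return re.sub(r"\s+", " ", text).strip()

# Placeholder URLs - replace with two of your instruction pages.
a = visible_text("https://example.com/instructions/model-a")
b = visible_text("https://example.com/instructions/model-b")

print(f"Text similarity: {difflib.SequenceMatcher(None, a, b).ratio():.0%}")
```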
I hope this helps!
-
RE: Need help fixing the duplicate content that keeps growing
That's still not quite enough to go on. Could you provide the message they're giving you, and/or the URLs involved in the duplicate content? Examples of either would be helpful.
-
RE: Need help fixing the duplicate content that keeps growing
You'll have to be a lot more specific. Try answering at least some of these questions so we can help you:
- What content is being duplicated?
- Where are the duplicates?
- Is this all internal (on your site only)?
- Are you receiving any duplicate content warnings from Moz, Google, etc.?
- In what way does it "keep growing"?
- What kind of content is this?
Once you provide answers to some of these questions, I'm sure we'll be able to help you fix the issue.
-
RE: Anyone know how to filter by location-neutral results?
I always run my searches in Incognito in Chrome, Private Browsing in Firefox, and InPrivate in Internet Explorer. These modes strip out personal information, login data, and stored cookies so your results aren't personalized, though your rough location can still be inferred from your IP.

-
RE: Why is Moz telling me I have duplicate content, but neither the content nor the URLs are duplicates?
It's showing as duplicate content because, aside from an image and a caption, there's no content on the page. Low-to-zero content pages that follow the same template/structure will often be seen as duplicates. Sorry to say, but these pages should either have content added to them or be deindexed. The latter is much easier, of course, and both are almost guaranteed to improve your site, but adding content would be the better solution: it adds value to those pages, you won't be removing them from the index, and it can incentivize engagement.
-
RE: URL Question: Is there any value for ecomm sites in having a reverse "breadcrumb" in the URL?
I don't see how that would even function. What comes after the root of a domain tends to be levels of folders and sub-folders, where such levels exist. So example.com/hats/red-hats/big-red-hats makes sense because big red hats sit within the red hats folder, and red hats sits within the hats folder. Reversing that order in your URLs wouldn't make sense to users, and it wouldn't make sense to search engines either.
If your thinking was that the closer a specific keyword sits to the domain, the better the ranking you can expect - think again. For any website, but especially for an eCommerce site, the structure of folders and sub-folders is very important; it helps both users and search engines find what they're looking for and parse your site.
Bottom line: I strongly advise avoiding a strategy like what you're suggesting. I can't foresee it having anything but a negative influence.
-
RE: Link to Moz Pro home 404s, and then I stumbled upon Moz's staging site...
It sure did. I'll email help@moz.com and provide more insight. Thanks!
-
Link to Moz Pro home 404s, and then I stumbled upon Moz's staging site...
I'm wary of giving more information, as it's probably best that as few people as possible are able to access Moz's staging site, for one reason or another. I should also mention that this was entirely unintentional on my part. That said, from the 404 on the Moz Pro home link, I tried to access "My QA", and that's when I realized that everything I'd done thereafter was within staging.moz.com (where I'd also posted a question similar to this one).
Does Moz permit access to their staging site, or did I stumble upon a mistake - or Moz-stake, if you will?
-
RE: Google Search Analytics: How to Get Search Keywords for a Page?
Hmmm. I'm not sure what the issue may be - when I follow those steps on my own property, I get the information you're asking for. Hopefully someone else can clue you in.
-
RE: Google Search Analytics: How to Get Search Keywords for a Page?
While in GSC, select the property you want to check, then select Search Analytics. At the top of the graph you'll see "Queries", "Pages", etc. Choose the page you want to see keywords for by clicking the arrow by "Pages" and filtering by that page. Then, voila!
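If you'd rather pull the same report programmatically, the Search Console API exposes it as well. This is a minimal sketch, assuming you already have google-api-python-client installed and a service account added as a user on the property; the key file, property URL, date range, and page filter are placeholders:

```python
# Query Search Console for the top queries driving a specific page.
# Assumes service-account credentials are set up and the account has
# been added as a user on the property.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES)  # placeholder key file

service = build("webmasters", "v3", credentials=creds)

body = {
    "startDate": "2016-01-01",  # placeholder date range
    "endDate": "2016-01-31",
    "dimensions": ["query"],
    "dimensionFilterGroups": [{
        "filters": [{
            "dimension": "page",
            "operator": "equals",
            "expression": "https://example.com/some-page/",  # placeholder page
        }]
    }],
    "rowLimit": 25,
}

response = service.searchanalytics().query(
    siteUrl="https://example.com/", body=body).execute()  # placeholder property

for row in response.get("rows", []):
    print(row["keys"][0], row["clicks"], row["impressions"])
```

Each returned row is one query, with its clicks and impressions for that page.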
As for your second question, I doubt it. GSC replaces GWT, so you'll have to get used to the new interface.

-
RE: Same H1 & H2 Tags
While it won't necessarily have SEO implications, I'd suggest steering away from duplicate headings, if only because they're redundant for anyone consuming your content. It could also be argued that this sort of duplication reduces your ability to diversify your keyword usage.
-
RE: Can a page be 100% topically relevant to a search query?
100% saturation is impossible to achieve while maintaining any semblance of value. Not only because any proper page inherently has navigation, internal linkage, and myriad other elements, but because to write content about a subject in that amount of detail, one would invariably need to write about sub-topics and related topics. It's just not feasible. But, and here's the kicker, you wouldn't want 100% saturation anyway.
Rich, dynamic content incorporates what's related to it. Strong pages link out to others and keep visitors moving through your content, if not pulling them further down the funnel. Good content covers a topic with both depth and breadth. I would say the highest saturation point that still makes for strong SEO and content optimization is about 85-90% of total page content - and even that's pushing it, really.
-
RE: Tough 301 redirect with a /www in it
I'm unfamiliar with the plugin, but usually when a full URL appears after the base domain as a subpage or subfolder, it means a redirect was set up incorrectly. Meaning, that isn't a URL you can't redirect - it's a URL that's already being redirected and is incorrectly looping back on itself. Check the plugin to see if that's the case, and if it isn't there, check the .htaccess as well as the page source itself. You can also trace the redirect chain from the outside, as in the sketch below.
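A minimal sketch for that trace, with the problem URL as a placeholder:

```python
# Follow a URL's redirect chain and print each hop, which makes a
# misconfigured or looping redirect easy to spot.
import requests

url = "http://example.com/www.example.com/some-page"  # placeholder problem URL

try:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    for hop in resp.history:
        print(f"{hop.status_code}  {hop.url}  ->  {hop.headers.get('Location')}")
    print(f"{resp.status_code}  {resp.url}  (final)")
except requests.TooManyRedirects:
    print("Redirect loop detected: the chain never resolves.")
```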