And don't forget to set your holiday hours. 
Posts made by BeanstalkIM
-
RE: Wordpress pagination and SEO
Hello SirMax.
Removing the "/page/" segment is easy enough with mod_rewrite (or, I'm sure, one plugin or another), but the issue has more to do with the weight the post is given by the engines than with the URL, since the final URL is the same either way. To illustrate, there's an article on Search Engine Watch at http://searchenginewatch.com/sew/how-to/2179376/internal-linking-promote-keyword-clusters that outlines how the math of internal links works (simplified). Essentially, your homepage is the strongest page on the site (at least I assume you're part of the 99.99% of sites where that's true), and even within a blog, the blog homepage will be the strongest page, so the pages it links to get the most weight. When one of your posts moves onto page 2, it's not the URL that causes the drop; it's that the link to that post now sits on a weaker page.
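To illustrate that math with a toy sketch (my own simplification, not the article's exact model): if each page splits its strength evenly among its outbound links, a post linked from page 2 inherits far less weight than a post linked from the blog homepage.

```python
def distribute_weight(page_strength, outbound_link_count):
    """Simplified model: a page splits its strength evenly among its outbound links."""
    return page_strength / outbound_link_count

# Blog homepage with strength 1.0 linking to 10 posts plus a "page 2" link (11 links).
homepage_strength = 1.0
share_from_homepage = distribute_weight(homepage_strength, 11)

# Page 2 only receives one homepage-sized share, then splits it among its own 10 posts.
page2_strength = share_from_homepage
share_from_page2 = distribute_weight(page2_strength, 10)

print(round(share_from_homepage, 4))  # weight reaching each page-1 post
print(round(share_from_page2, 4))     # weight reaching each page-2 post: an order of magnitude less
```

The exact numbers don't matter; the point is the drop between the two tiers, which is why keeping a link to the post on the blog homepage matters.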
The related posts plugin is a solid idea but won't fix the specific issue you're having (though it will help in other ways). What you probably want is something like this plugin, which displays your most popular posts. Place its widget on your homepage (read: blog homepage), set the post count to a reasonable number (10 or 20 max), and it will always display your most popular posts there, helping ensure their internal link weight is maintained. This will slightly weaken the existing post links by adding more links to the page, but it should help you address the issue you're having.
One concern to be aware of: you do need to review these pages. Some posts are popular simply by virtue of age and won't deliver the highest value for current traffic. Occasionally review which posts are appearing; if posts with low current traffic are blocking others (those just below the threshold you set in the plugin), you could be losing traffic, so you'll want to exclude them.
Hope that helps.

-
RE: Is your live site supposed to have rel canonical tags?
I'd be interested in hearing what someone else has to say about the way the canonicals are coded. You're doing yours similar to the way I do DNS prefetching, with the double slash to start the URL (using example.com as a stand-in):
<link rel="dns-prefetch" href="//example.com">
It works fine with prefetching, as all the browser needs to do is find the IP of the domain, but I'm not sure how it will handle sub-directories (including www), and I hate variables even when the answer is "it should work". The more common way to canonicalize your secured page would be:
<link rel="canonical" href="https://www.example.com/page/" />
I'd be interested to hear if anyone has any direct experience with this but at the core of technical SEO issues I always lean to "most common usage" and "how Google shows it in their examples" just to make sure there is minimal chance of hiccups or issues.
That aside, the developer is right, though I'd still prefer to see the pages at a single URL. Since that can't be done, however ... canonicals are the way to go.

-
RE: Is your live site supposed to have rel canonical tags?
I'm not sure I entirely understand the scenario, so let me lay out how I'm hearing it, both to check my understanding and to put the answer into context. Please do let me know if I've got the scenario wrong, as that may well change my thoughts on it.
You note that your secure site and live site are creating duplicate content. Of course a secure site can be live, but I'm taking this to mean you have an area behind a login. That it's creating duplicate content makes me think a lot of the core information is the same, and I'm guessing many of the same pages.
If this is all correct and you can't serve the duplicated pages at one URL only, then canonicals are the way to go and your developer is correct.
-
RE: Using same business number on different websites
SEO aside ... how would you know how to answer the phone?
-
RE: Can you use Screaming Frog to find all instances of relative or absolute linking?
Depending on how you've coded everything, you could try setting up a Custom Search under Configuration. This scans the HTML of each page, so if your coding is consistent you could enter something like href="http://www.yourdomain.com" as the search string, and the Custom tab will show you all the pages that match it.
That's the only way I can think of to get Screaming Frog to pull it but looking forward to anyone else's thoughts.
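If you'd rather script it than use Screaming Frog, the same check is easy to sketch with Python's standard library. This is a rough sketch under my own assumptions (the sample HTML and the absolute/relative test are mine; protocol-relative `//` links are counted as absolute):

```python
from html.parser import HTMLParser

class LinkClassifier(HTMLParser):
    """Collects href values and sorts them into absolute vs relative links."""
    def __init__(self):
        super().__init__()
        self.absolute, self.relative = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        for name, value in attrs:
            if name == "href" and value:
                # Treat full URLs and protocol-relative URLs as absolute.
                if value.startswith(("http://", "https://", "//")):
                    self.absolute.append(value)
                else:
                    self.relative.append(value)

html = '<a href="http://www.example.com/a">x</a><a href="/b">y</a>'
parser = LinkClassifier()
parser.feed(html)
print(parser.absolute)  # ['http://www.example.com/a']
print(parser.relative)  # ['/b']
```

You'd feed it each page's HTML from your crawl; it sidesteps the need for consistent coding that the Custom Search approach depends on.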

-
RE: Is there a purpose to the "google my business" description?
It's always a good rule of thumb to follow Josh's advice here. If Google gives you an opportunity to give them more info about your business or site ... drink the Kool-Aid and do it.

-
RE: Substantial drop in organic traffic and keyword rankings
I'm always hesitant to switch domains, and you do have some good links in there that would be a shame to lose. If you have a manual penalty (which I assume you don't), it might be something to consider (though still not my favorite path), but with just an algorithmic kick in the butt I'd audit the links, disavow the low-quality ones, and build up from there.
As a warning I'm sure you're aware of: when you disavow the bad links, you'll likely be disavowing some links that are currently passing weight. The site might see a small drop in rankings afterward, but that's better than a massive drop with the next Penguin update.
-
RE: Substantial drop in organic traffic and keyword rankings
Ah, I thought it might relate to links. Did your client let them go in the spring of 2014?
-
RE: Substantial drop in organic traffic and keyword rankings
As Josh will find out, it'll be hard to post a public response, as it would likely require revealing more than I'd feel comfortable sharing, but I'm interested to know whether Josh and I ended up with the same general take on the potential cause.

-
RE: Substantial drop in organic traffic and keyword rankings
Hey Karl, Dave here.
As I'm sure you can imagine, it's pretty tough to get a feel for the issue without more info. If you'd like to PM me the URL I'm happy to take a quick look. I may respond back here to help others but I won't list off any details that could be considered proprietary or confidential.
Cheers!
Dave
-
RE: Non-Existent Parent Pages SEO Impact
Provided that there are no links to http://clientname.com/products/ and it's obviously not part of the breadcrumbs, you're fine, though you may want to consider adding these pages to give you a single point linking to all products (or to a list of product categories), etc.
But no ... the structure itself isn't inherently going to cause problems.
-
RE: How To increase page authority and domain authority
Might be my favorite answer of all time.

-
RE: New Site (redesign) Launched Without 301 Redirects to New Pages - Too Late to Add Redirects?
Personally I'd start with a link analysis to answer the question, "Are they stronger than you?" You'll want to look at sheer volume as well as quality, and at when the links were built, to get a feel for their current activities. After that I'd obviously look at your content. Does it comply with current SEO best practices in its type and formatting, right down to the technical questions such as "Do you have clean and fast code?" and "Are you formatting properly?"
If you're looking for assistance in the process, Moz actually offers a list of their recommended SEOs. It's a good list. You'll find it at https://moz.com/community/recommended.
-
RE: Geographic site clones and duplicate content penalties
I do agree that, taking the guidelines verbatim, you could make a good case. My concern is that it's not some guy at Google sitting down, judging sites, and asking, "Does this violate the guidelines?"; it's a bot, and as I'm sure everyone here can attest ... Panda and Penguin aren't perfect. One can just ask Barry Schwartz of the very credible SE Roundtable about getting hit with a Panda false positive on content issues, and about the cost in traffic it causes. Or you can read his post on it here.
Or maybe I'm just paranoid. That could well be.

-
RE: Geographic site clones and duplicate content penalties
There are always exceptions to rules, but for safety I would highly recommend blocking the .com site until you can get some genuinely unique content on it. It stands a high chance of taking its own devaluation (almost certain) and may impact the .co.uk site as well (and really ... why risk it?).
If the scenario were mine, I'd have simply built in customized pricing and other relevant information based on IP, but if that's not your area (and fair enough, as it can get a bit complicated), then the redirection you're doing now to get visitors to the right site is the logical option. I'd block the .com in your robots.txt, put the noindex,nofollow meta in for good measure, and start working on some good unique content. If you won't have time for that, just enjoy your UK rankings.
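For clarity, here's roughly what that blocking setup looks like (the domain is a placeholder). One caveat worth knowing: if robots.txt blocks crawling, Google may never fetch the pages to see the meta tag at all, so in practice many people pick one mechanism or the other rather than both.

```
# robots.txt at http://www.example.com/robots.txt on the .com site
User-agent: *
Disallow: /
```

and in the <head> of each page:

```
<meta name="robots" content="noindex,nofollow">
```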

-
RE: New Site (redesign) Launched Without 301 Redirects to New Pages - Too Late to Add Redirects?
_When is the right time to just start over with a new domain name?_ Hindsight being the 20/20 that it is, it's very hard to know until it's too late. I always suggest trying to work with your current site, as it's generally easier to repair than to replace (generally ... not always).
The variable at play now is that after two years your site may have recovered BUT not be ranking, as the competition may well have upped their game or other algorithmic factors may be at play. I've seen it a number of times: sites don't bounce back not because they didn't do the right thing, but because while they were busy repairing their issues, their competition was busy moving their sites forward.
To know what to do, I'd start with a round of competitor analysis. Don't compare your rankings with where they were; rather, compare your site's strength relative to the people ranking today. And of course, try not to think of your content or links as better simply because you like them; try to look at it all as a bot would.