Posts made by LoganRay
-
RE: Https & http
When you set up the 301 redirect rule that sends HTTP requests to HTTPS, Google will notice. Leave your XML sitemap the way it is (with HTTP URL references) for 30 days; that gives Google sufficient time to recrawl the URLs in your sitemap and pick up the new protocol as it hits the redirects. Once most of your indexed pages have switched to HTTPS, you can update your XML sitemap to the secure URLs.
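When you do make the switch, each entry in the sitemap just changes protocol - for example (with example.com standing in for your domain):

<url><loc>http://www.example.com/page/</loc></url>
becomes
<url><loc>https://www.example.com/page/</loc></url>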
-
RE: Https & http
Rolling back to HTTP for non-checkout pages is an option as well. The main point I was trying to make was to not have both versions of your URLs accessible/indexable.
-
RE: SEO for Product Pages Deal that will last One Day Only
Hi Raul,
'Deal of the day' ecommerce models rely heavily on email, social, and paid search, and not so much on SEO. SEO is a slow play, so I wouldn't put too much energy into planning SEO efforts around this.
That being said, a noindex tag is going to be sufficient to prevent the deal-of-the-day section from hurting other areas of your site (if it would at all). You don't need to worry about putting a canonical tag on that product detail page since you're not indexing it.
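For reference, the noindex tag goes in the <head> of each deal page and looks like this:

<meta name="robots" content="noindex">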
Hope that helps!
-
RE: Https & http
Hi,
Both the HTTP and HTTPS versions of your site will render, and that's a problem. Since you've got an SSL certificate and it's been applied to the home page, you should make your entire site secure. Once you've done that, you'll want to apply a redirect rule that sends all HTTP requests to the HTTPS version; because you're not currently doing that, you're running the risk of duplicate content issues. Once the redirect is in place, yes, you should set the primary domain in Google Search Console (WMT) to HTTPS. There are a few other steps you'll want to take as well - Cyrus Shepard wrote a great post detailing all the necessary steps for a secure migration, and I highly recommend reading it.
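If your server runs Apache, the redirect rule is usually just a few lines in your .htaccess - a minimal sketch, assuming mod_rewrite is enabled:

RewriteEngine On
# Send every HTTP request to the HTTPS version of the same URL
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]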
Additionally, when visitors on your site are bouncing back and forth between HTTP and HTTPS, it's destroying your data integrity in Google Analytics. Going from an HTTPS page to an HTTP page strips the referrer, which breaks the session and starts a new one attributed to direct traffic. You can see how this would quickly become a nightmare for accurate analysis and measurement. If you follow the steps in Cyrus' post, your GA data should return to normal because users won't be going back and forth between secure and non-secure pages.
-
RE: Taxonomy question - best approach for site structure
Honestly, search engines aren't that particular about URL structure. It's important, but not to the degree where one of these two examples is going to make or break your SEO campaign. That being said, I usually set up my URLs with the broadest category in the first folder and get more granular from there. In your first example, the assessment and treatment folders make more sense to me, since there's additional content that could live in each of those respective folders. In your second example, there's less opportunity for future content to live in those folders.
-
RE: URL has caps, but canonical does not. Now what?
I've had some run-ins with case-sensitive URLs in the past, and it drives me crazy - I don't understand why CMSs still do that!! While canonical tags are a perfectly fine way to handle this, there's a better solution: Brian Love wrote a great blog post on how to do server-side URL lower-casing. I've used it on a few sites and it works great.
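If you're on Apache, the general shape of the server-side fix looks something like this (just a sketch - note that RewriteMap has to live in the server or virtual host config, not in .htaccess):

# Define a lowercase map using Apache's built-in tolower function
RewriteMap lc int:tolower
# 301 any URL containing uppercase letters to its lowercase version
RewriteEngine On
RewriteCond %{REQUEST_URI} [A-Z]
RewriteRule (.*) ${lc:$1} [R=301,L]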
-
RE: Should a login page for a payroll / timekeeping company be no follow for robots.txt?
I'd recommend noindexing that page; that's the only sure-fire way to keep it out of the index. Allow it to be crawled in your robots.txt, though, or search engines won't be able to see your noindex tag.
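In other words, the login page keeps a noindex tag in its <head> while robots.txt leaves it crawlable - a sketch, assuming a hypothetical /login path:

User-agent: *
# No "Disallow: /login" line - bots need to be able to fetch the page
# in order to see its <meta name="robots" content="noindex"> tag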
-
RE: Does Google penalise in the way described in this article?
The concept is sound, but their delivery is misleading. Keyword cannibalization is definitely a real thing; you should always avoid targeting the same keyword or topic across multiple pages on your site.
-
RE: Any downside to a whole bunch of 301s?
301 redirects do have a significant impact on page speed for mobile users, since mobile devices are often connected to much less reliable networks. Varvy has a great article with more details: https://varvy.com/mobile/mobile-redirects.html
If Google has already reindexed all of your new URLs, then you don't need to worry about covering every single one of your old URLs - stick with the ones that had links pointing to them.
A good way to measure how many of your 301 redirects are actually being used is to append query parameters to the resolving URL (example below), where the src parameter is set to the referring URL. That gives you unique identifiers you can filter on in your landing page report in Google Analytics.
/old-page >> /new-page?redir=301&src=/old-page
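In an Apache .htaccess, a redirect carrying those parameters might look something like this (a sketch - 'redir' and 'src' are just the parameter names from the example above):

RewriteEngine On
# 301 the old URL and tag the destination so it's filterable in GA
RewriteRule ^old-page$ /new-page?redir=301&src=/old-page [R=301,L]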
-
RE: What happened to YouMoz?
Christy,
I still don't see anywhere to submit posts, but I have noticed a lot of the more recent posts are from non-Mozzers. What's their secret?!?
-
RE: Alternatives 301? Issues redirection of index.html page with Adobe Business Catalyst
Hi Anna,
Generally speaking, Google is pretty good at indexing only the / version of a homepage. But if you're having problems with both versions being indexed, you can use a canonical tag on the homepage to solve this. For more information, check out Moz's guide to canonicals.
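The tag goes in the <head> of both the / and /index.html versions and points at the version you want indexed - for example (example.com is a placeholder for your domain):

<link rel="canonical" href="https://www.example.com/" />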
From a traffic measurement perspective, you can configure Google Analytics to associate metrics from both URLs with just the / version. Go to your filters and create a new custom filter, select 'Search and Replace', then set the search string to /index.html and the replace string to /.
Hope that's helpful!
-
RE: Any downside to a whole bunch of 301s?
I generally try to keep redirect lists for my clients under 100. You mentioned you had some links pointing to 404 pages; I'd focus on those and add others as you see fit based on traffic volume to the old pages. I've never actually tested the threshold at which site speed starts to become a problem - I see some experimenting in my future!
-
RE: Any downside to a whole bunch of 301s?
Hi,
You should keep your 301s to a minimum. Every time a URL is requested, the server checks the request against every redirect rule you have until it finds a match, so the larger your redirect list gets, the more impact it'll have on site speed.
-
Thoughts on RankScience?
I'm sure most of you have heard about this startup, RankScience, that has big ambitions to disrupt the SEO industry with its automated (I know, I know...the words 'automated' and 'SEO' in the same sentence!!!) optimization software. Their claim is that by running thousands of concurrent A/B tests on your site, they can maximize rankings and organic traffic.
Initially my thought was "oh crap, there goes my (and a lot of other people's) career". But then I started thinking about it a bit more and realized a couple of things. First, software can't replace a face-to-face client meeting. Being in the agency world as most of us are, client interactions are vital to a sustained partnership. Second, someone is going to have to understand what this software does, configure it, and monitor it, and I'm OK with that being part of my job if that's how the industry shifts. Third, and most importantly, in theory this software has the capability to reverse engineer search algorithms. If they have 10,000 websites on their platform and are collecting data on what works and what doesn't, it's only a matter of time before they can pick the algorithm apart piece by piece and figure out exactly how it works. Google is obviously not going to like that very much and will almost certainly right the ship.
That's my 2 cents. Looking forward to your thoughts on RankScience and the future of our industry.
-
RE: Is AMP works on blogs only?
Hi Stephanie,
If you're using the same WP plugin I have installed on my personal site, then yes, that is for blog posts only. I've been waiting for them to push an update that includes other content types, but nothing yet. AMP can be applied to any page on your site, just not as easily as blog content.
Regarding Search Console, it takes a while for AMP pages to get indexed. Go into the source code of the non-AMP version of one of your posts and make sure there's a rel="amphtml" tag that points to the AMP version of that same URL. Without it, it's basically impossible for search engine bots to discover your AMP content.
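The tag sits in the <head> of the regular post - a sketch with a placeholder URL (the /amp/ suffix is a common WP plugin default, so adjust to whatever your setup uses):

<link rel="amphtml" href="https://www.example.com/sample-post/amp/" />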
Hope that's helpful, good luck!
-
RE: Redirect 'keyword-url' to improve ranking?
Hi Jan-Peter Boer,
Ten years ago, this tactic probably would have helped your situation. Modern algorithms will all but ignore it, and it will provide you with no benefit. The only way it would help is if the keyword-rich domain had a bunch of high-quality, topically relevant links pointing to it, which would then, by way of the redirect, point to your main domain.
-
RE: [Very Urgent] More 100 "/search/adult-site-keywords" Crawl errors under Search Console
Oh yea, I missed that. That's very strange, not sure how to explain that one!
-
RE: [Very Urgent] More 100 "/search/adult-site-keywords" Crawl errors under Search Console
I've seen similar situations, but never in bulk and never with adult sites. Basically, what's happening is that a domain (or several) is linking to your site with inaccurate URLs. When bots crawling those sites follow the links pointing to yours, they hit a 404 page, which triggers the error in Search Console.
Unfortunately, there's not much you can do about this, as people (or automated spam programs) can create a link to any site at any time. You could disavow links from those sites, which might help from an SEO perspective, but it won't prevent the errors from showing up in your Crawl Errors report.
-
RE: Export Google Search Results
There's no tool that will handle that kind of volume. I know of one tool called SERP Scraper that will do 100 URLs, but that's no good when you're evaluating 300k. I'm fairly certain Google makes it impossible for anyone to build this kind of tool; as useful as it would be, it would already exist if that weren't the case.
-
RE: Please take a look at my canonical tag - is it written right?
Unfortunately, I'm not much of a coder, so I won't be able to guide you on the .htaccess piece. Regarding the Search Console items, though: the tool treats every site that is set up as its own entity, which is why you need a country setting and an XML sitemap for each one. An example of why they do this: you might have different profiles for http://www.example.com/us and http://www.example.com/ca, where the subfolder specifies the country. If they recycled the same info across each profile, the /ca site would be set to the U.S. instead of Canada.