Posts made by MikeRoberts
-
RE: 301 redirects
Having too many 301 redirects in a chain can have a negative impact. That is, don't 301 a page to another page that 301s to another page that 301s to another page, and so on. Google has stated they will follow about five hops in a 301 chain before giving up. But honestly, why would you choose to redirect to a redirecting page when you could point it at something much more relevant? As for having a bulk of 301s, I wouldn't worry. If you had 300 different pages that were all being redirected to 300 other pages, Google would not devalue you for it. If your redirects are relevant and good for the user experience, then you're fine.
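A quick way to audit chains like that is to walk your redirect map and flag anything deeper than the roughly five hops Google will follow. A minimal sketch, assuming your 301s can be exported as a simple old-URL-to-new-URL mapping (the URLs below are made up):

```python
# Follow a URL through a map of 301s; flag chains deeper than ~5 hops,
# which is roughly where Google gives up. The mapping is illustrative.
MAX_HOPS = 5

def follow_chain(url, redirects, max_hops=MAX_HOPS):
    """Return (final_url, hops); raise if the chain loops or runs too deep."""
    hops = 0
    seen = {url}
    while url in redirects:
        url = redirects[url]
        hops += 1
        if url in seen:
            raise ValueError("redirect loop at " + url)
        if hops > max_hops:
            raise ValueError("chain deeper than %d hops" % max_hops)
        seen.add(url)
    return url, hops

redirects = {"/old-a": "/old-b", "/old-b": "/old-c", "/old-c": "/final"}
print(follow_chain("/old-a", redirects))  # ('/final', 3)
```

Any chain deeper than one hop is worth collapsing: point every old URL straight at the final destination.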
-
RE: What is the radius for local search results
For a local shop whose Google My Business listing indicates they serve customers at their location (as opposed to a delivery radius or areas served), Google bases the businesses shown in a search either on the location the user specifies, or on what it knows about the businesses of the area in relation to the query plus any geolocation information about the user. So there isn't exactly a radius bubble that you would need to fall into in those specific kinds of situations.
Now, for industries like landscaping, cleaning companies, food delivery, emergency auto repair, tow trucks, and so on, businesses can set a radius they serve within or specific areas they serve. So a locksmith might set a handful of postal codes as the regions they will drive to in order to fix your locks, while a pizza delivery service might set a radius of 25km for their service area because they might not be able to reliably deliver outside that.
All of these things can be set up in their Google My Business account.
I know from personal experience that Google will show me things easily 100 miles away from my location if there is nothing in between that fits my search.
-
RE: Affiliate links and parameters creating duplicate page titles
How long ago did you implement canonicals to fix the issue? It's important to remember that a lot of SEO isn't about quick fixes. Some of these things will take weeks or months to completely filter through before you see the outcomes you were looking for.
You could "NoIndex, Follow" the pages if you really wanted to, but I'd suggest waiting a bit longer for the canonicals to start working. If the canonicals really don't seem to be working for some or all of those pages, and you don't want to give it more time after a few weeks, then you could reasonably "NoIndex, Follow" them... but you could be harming link equity that would have been attributed to the canonical page.
As for the 301 idea, it really depends on what the filtering parameters are doing. If they're filtering out sections of a page for users or helping you determine lead attribution in analytics, then I wouldn't indiscriminately 301 them. If they're being autogenerated but have canonicals in them from the start, then you should be fine. As above though, if after a few weeks of existing with canonicals Google is still throwing duplicate errors at you for those specific pages, then you can NoIndex them if you feel it is necessary.
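If you want to verify the canonicals are actually in place on those parameterized pages, it's easy to check programmatically. A minimal sketch using only the Python standard library; the sample markup and URL are made up:

```python
# Pull the rel="canonical" href out of a page's HTML so you can confirm
# parameterized URLs point at the clean page. Sample markup is made up.
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

html = '<head><link rel="canonical" href="https://example.com/widgets"></head>'
finder = CanonicalFinder()
finder.feed(html)
print(finder.canonical)  # https://example.com/widgets
```

Run this against a handful of the filtered URLs: every variant should report the same clean canonical URL.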
-
RE: Affiliate links and parameters creating duplicate page titles
It's important to remember that a canonical is a suggestion, not a directive. So adding those canonicals is the right way to do it, but the search engines and crawlers will determine at their leisure whether they believe the canonical is right or not. Sometimes it's a quick fix, and sometimes they don't accept the canonical at all. If the pages are exact duplicates, then Google will get the picture eventually and start recognizing it. As for the Moz crawler, I haven't had a paid account in a while so I can't speak to how the system works currently, but it used to be an issue that the Moz crawler would never seem to properly recognize a canonicalized duplicate... so that may still be the issue.
-
RE: Handling redirects when 2 companies merge
Based on my limited experience with this type of situation, a good start is to canonical each page on the old domain to its counterpart on the new domain and place a call to action on the old sites letting people know of the move and/or branding change. Submit a crawl request and let the bots begin filtering through the changes without affecting the user experience yet. Then after some time, once your regulars have become acclimated to the upcoming changes and so have the bots, you can 301 those pages from the old site to the relevant counterparts they were already canonicalized to.
-
RE: Not Ranking - Any Tips?
It's only been a month and it's a moderately competitive landscape, so it may take some time for all the changes to fully filter through and start ranking you better. Have you done a crawl request on the site? Are your pages all indexed? Is everything redirected properly that needs to be redirected? Is your robots.txt set up properly? And have you seen any growth in important metrics in analytics since the changes were made that might signal the changes are starting to work?
-
RE: Do you need contact details (NAP) on every page of your website for local search ranking ?
If there's no footer, why not at the top of the page? Something along the lines of "Located at the intersection of street and road in the center of Town" with a nice, obvious click-to-call?
-
RE: Do Search Engines Try To Follow Phone Number Links
While a bot might try to follow it (because it is, in its simplest form, a type of link), that will not in any way adversely affect you. The tel: in the tag will tip them off that it is a telephone number and/or should be click-to-call. So no link equity will be lost, you won't start seeing tons of 404 warnings, or anything of that sort.
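The same idea applies if you run your own link checks: filter out non-HTTP(S) schemes up front so tel: and mailto: links never show up as broken-link noise. A minimal sketch; the link list is made up:

```python
# Skip non-HTTP(S) schemes like tel: and mailto: when queueing links to
# crawl, so they never get reported as broken. Links are illustrative.
from urllib.parse import urlparse

def crawlable(href):
    """True for relative links and http/https URLs; False for tel:, mailto:, etc."""
    return urlparse(href).scheme in ("", "http", "https")

links = ["tel:+15551234567", "mailto:info@example.com", "/contact", "https://example.com/"]
print([h for h in links if crawlable(h)])  # ['/contact', 'https://example.com/']
```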
-
RE: Impact of Non SEO Subdomains
It is possible that a subdomain-based landing page could be cannibalizing rankings for specific terms from another page on your site. But if that landing page really is that good for the term, it's not necessarily a bad thing to have it ranking. If it's ranking better than the pages you optimized for organic, then maybe you should look at why that is (i.e. is it getting good or better links, are people sharing it around, is it better targeted than the organic page, is it more intuitive, does it have a better call to action, etc.).
Now, if you really don't want those pages to rank in organic in place of the optimized pages you created, then you can very easily add a NoIndex tag to the page or exclude it from being crawled in robots.txt.
-
RE: Menu Structure
You could always test the link to see if it is really being used from the secondary navigation more so (or at all) than the main navigation link. Create a parameter and track it over a few months in analytics. That way you don't over-optimize in the interim but 3 months from now (or less, or more, really that's up to you) you can definitively say whether it is better to remove it or if leaving it alone was the correct move.
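One way to set that up: append a parameter to the secondary-nav version of the link so the two placements separate cleanly in analytics. A minimal sketch; the parameter name "nav" and the URL are made up:

```python
# Append a tracking parameter to a nav link so clicks from the secondary
# navigation can be told apart from main-nav clicks in analytics.
# The "nav" parameter name and URL are made up for illustration.
from urllib.parse import urlencode, urlparse, parse_qsl, urlunparse

def tag_link(url, **params):
    """Return url with the given query parameters merged in."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))
    query.update(params)
    return urlunparse(parts._replace(query=urlencode(query)))

print(tag_link("https://example.com/services", nav="secondary"))
# https://example.com/services?nav=secondary
```

If these tagged URLs can get indexed, canonical them back to the clean URL so the test doesn't create its own duplicate-content issue.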
-
RE: Menu Structure
I can't really visualize from your description how the links are laid out. It's been a long day, so that might be it.
It really depends on how it links back to those pages. E.g. if they're site-navigation breadcrumbs, then I don't see the problem, as they potentially establish relevancy for the topic and facilitate movement through the site for both users and bots. If it's just an internal link for the sake of a link in the body of the page, maybe not so much. But if it's a completely relevant link and you see in analytics that people who enter on the one page regularly go to the other, and vice versa, then obviously it is of use to the customer/visitor. If it's an issue of pages being link-heavy and you're worried that it's diluting link equity or creating a user experience issue, and you want to clean up the page and/or make it easier and more intuitive to use, then a heat map tool like Crazy Egg might be useful for helping you determine which links to keep and which are flak.
-
RE: When is Too Many Categories Too Many on a eCommerce site?
Are the categories helpful for the customer? On one hand you don't want to lump too many things into one category when they can be broken out into more granular categories that better serve visitors. On the other hand, it won't help you or your customers if you get too granular and break everything out into categories based on the most insignificant details.
While keyword cannibalization is a concern, serving your visitors/customers what they want, in the way they prefer to see it, will likely do more for your site's metrics than worrying about a nebulous concept like "how many categories is too many." If you have 200 different categories but they are well targeted, and you want to add another (or ten more) that are equally well targeted, then why wouldn't you do it?
-
RE: We possibly have internal links that link to 404 pages. What is the most efficient way to check our site's internal links?
Do you have Google Webmaster Tools/Search Console set up for your site? Google will let you know through that tool when it notices a 404 on your site. Alternatively, you could download a tool like Screaming Frog and run a crawl of your own site to see what 404s it finds.
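If you'd rather script it than use a crawler, the core check is simple once you have each page's internal links: anything pointing at a page that doesn't exist is a broken link. A minimal sketch over a made-up link graph standing in for real crawl data:

```python
# List internal links that point at pages which no longer exist (and
# would 404). The link graph and page set are made-up stand-ins for
# data you'd collect from a crawl of your own site.
def find_broken_links(link_graph, existing_pages):
    """Return (source_page, dead_target) pairs."""
    broken = []
    for page, links in link_graph.items():
        for target in links:
            if target not in existing_pages:
                broken.append((page, target))
    return broken

link_graph = {
    "/": ["/about", "/blog"],
    "/about": ["/team"],  # /team was removed at some point
}
print(find_broken_links(link_graph, {"/", "/about", "/blog"}))
# [('/about', '/team')]
```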
-
RE: We 410'ed URLs to decrease URLs submitted and increase crawl rate, but dynamically generated sub URLs from pagination are showing as 404s. Should we 410 these sub URLs?
You could, but it's not completely necessary to go through all those sub-pages to 410 them. While a 410 Gone response is a stronger signal, those pages serving 404s will eventually be removed from the crawl and/or SERPs by the bots anyway. So if those pages are just dynamically generated flak, and don't provide anything of benefit, then leave them as 404s and don't worry about it.
-
RE: Please let me know if I am in a right direction with fixing rel="canonical" issue?
As Logan said, you'd be better served handling these with 301 redirects. But you will also want to go into Site Settings in Google Search Console/Webmaster Tools and set your preferred domain to either WWW or Non-WWW (depending on which you prefer to show across your site).
-
RE: 404 broken URLs coming up in Google
Agreed. Go to Search Console, see what 404 errors Google is throwing your way, 301 redirect anything that can & should be redirected from the list to their most relevant equivalent on the live site, and then fetch & submit the site for a recrawl.
OR (since the links you posted were for a test site), if that test version needs to stay up for internal testing purposes, then you can potentially NoIndex the pages, resubmit for crawl so the bots see the NoIndex on the pages, and then after they've dropped out of the SERPs you can update your robots.txt to disallow the folder those pages are sitting on. (Not sure if there's a better/quicker way to get them out of the SERPs if you still need the site live.)
-
RE: Google Indexing Desktop & Mobile Versions
You can easily restrict portions with robots.txt depending on how exactly your site is set up. So for instance, something like:
Desktop site: http://www.domain.com/robots.txt
User-agent: Googlebot
Allow: /

User-agent: Googlebot-Mobile
Disallow: /

Mobile site: http://m.domain.com/robots.txt
User-agent: Googlebot
Disallow: /

User-agent: Googlebot-Mobile
Allow: /

Or:
User-agent: *
Allow: /

User-agent: Googlebot
Disallow: /mobile/
Allow: /

User-agent: bingbot
Disallow: /mobile/
Allow: /

User-agent: Googlebot-Mobile
Disallow: /
Allow: /mobile/

User-agent: bingbot-mobile
Disallow: /
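You can sanity-check rules like these before deploying them with Python's standard-library parser. A minimal sketch with a made-up domain; note that this parser evaluates rules in file order, which can differ from Google's longest-match precedence for Allow/Disallow, so keep Allow lines before broader Disallow lines when testing this way:

```python
# Check which crawlers may fetch which paths under a draft robots.txt.
# The domain and rules are illustrative. Caveat: Python's parser matches
# rules in file order, unlike Google's longest-match precedence.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: Googlebot-Mobile
Allow: /mobile/
Disallow: /

User-agent: Googlebot
Disallow: /mobile/
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("Googlebot", "http://www.domain.com/mobile/page"))         # False
print(rp.can_fetch("Googlebot", "http://www.domain.com/page"))                # True
print(rp.can_fetch("Googlebot-Mobile", "http://www.domain.com/mobile/page"))  # True
print(rp.can_fetch("Googlebot-Mobile", "http://www.domain.com/page"))         # False
```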
-
RE: Canoncial tag for Similar Product Descriptions on Woocommerce
I agree with Laura on this one. If the content of each page is 99% the same as the others (and/or 99% the same as what all your competitors are doing), then you're not going to rank and be found for these products, especially if there is an older, more established brand in your industry. Your best option is really to fill out those pages with more unique content. It can be daunting, but you can get them to rank and be found with just a little bit of work. (Trust me on this: I used to work for an ecommerce company that had a few hundred products [each with 7-12 micro-variations] that were legitimately the same thing as each other with a slight color or texture difference at best... you'd be amazed how many ways there are to sell the same thing without duplicating copy.)
Throw together a landscape report, get an idea of all the various core terms in your industry, lay out a plan for what pages will use what term(s) and how, and if you don't have an in-house content writer it wouldn't hurt to look into hiring one (even part time) to get 89+ pages banged out for your site.
-
RE: Number of searches for specific keywords
There is no way to find the exact number but, as Tymen said, the AdWords Keyword Planner tool is one of the better ways to determine search volume, which can be broken down by country, state, province, county, etc. to get a better understanding of the available landscape for your pages. There are also tools like SEMrush and Spyfu that can pull search volume.
If you're using more than one tool, you'll notice a disparity in the numbers on many occasions. There's no perfect way to find the actual search volume with 100% accuracy; some tools pull their information from one set of sources while others use different ones. Both are technically correct, but the numbers should be taken as estimates of potential volume.
The Moz Keyword Difficulty Tool also pulls some search volume data.
-
RE: 4000 new duplicate products on our ecommerce site, potential impact?
Is the 4000-product site still going to exist, or is it being stripped and moved to the 9500-product site? If everything is getting completely moved from one site to the other, then you really do need to find out who has access to canonicals or 301 redirects so you can move the sites properly. If the smaller site is staying up and still selling those products, realize you'd potentially be cannibalizing your own traffic and could wind up with shoddy rankings from all the scraped/duplicate content.
Since you have no access to Canonical/NoIndex/Robots/etc., the question is: what do you have access to? Do you need to move all these products over? Are they exact duplicates of things you have on your site already? If a product is an exact duplicate of something you offer, then you probably shouldn't add a duplicate page; it should get a canonical or 301 if you were able to set one. If they're close but have slight differences, then you might be better served by adding a new product option to the existing page for the similar product in order to better serve the consumer, instead of diluting rankings with something so similar. Though you still might need that canonical or redirect to ensure everything is targeted properly.