Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Search Engine Trends

Explore current search engine trends with fellow SEOs.


  • As for the URL transfer, Dave Nash's answer is right: you pass the authority of URLA.com when you 301 redirect it to URLB.com. You also need a process for transferring the authority of URLA.com's existing pages to the new domain URLB.com:
    1. Map each URL of the existing website (at least the important ones) with a 301 redirect to its equivalent on the new domain (e.g. the About-us page redirects to the About-us page on the new domain). This transfers most of the SEO value and authority to the right pages.
    2. Register and verify both the old domain and the new domain in Google Webmaster Tools.
    3. Create a custom 404 page for the old domain that suggests visiting the new domain.
    4. Test the redirects from the old domain to the new domain in a development environment. Ideally this is a 1:1 redirect (http://www.example-old-site.com/... to http://www.example-new-site.com/...).
    5. 301 redirect your old domain to your new domain.
    6. Submit your old sitemap to Google and Bing; the submission pages are in Google Webmaster Tools and Bing Webmaster Center. This makes the engines crawl your old URLs, see that they are 301 redirects, and update their indexes accordingly.
    7. Fill out the Change of Address form in Google Webmaster Tools.
    8. Create a new sitemap and submit it to the engines. This tells them about any new URLs that were not present on the old domain.
    9. Wait until Google Webmaster Tools updates, and fix any errors it flags in the Diagnostics section.
    10. Monitor search engine results to make sure the new domain is being indexed properly.
    I hope this helps; let me know in a reply if you have further questions. Regards, Vijay
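The 1:1 redirect mapping described above can be sketched as Apache mod_rewrite rules. This is a minimal example only, assuming Apache with mod_rewrite enabled; the domains and the about-us path are the placeholders from the answer, not real URLs:

```apache
# In the OLD site's .htaccess: 301 each URL to its new-domain equivalent.
RewriteEngine On

# Explicit 1:1 mappings for important pages (about-us -> about-us, etc.).
RewriteRule ^about-us/?$ http://www.example-new-site.com/about-us [R=301,L]

# Catch-all: preserve the path for everything not mapped explicitly.
RewriteRule ^(.*)$ http://www.example-new-site.com/$1 [R=301,L]
```

Because the path is captured and re-used, most old URLs land on their exact new counterpart, which is what passes the authority page-by-page.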

    | Vijay-Gaur
    0

  • See how it goes over the next few days, Clojo, and if it doesn't settle down, perhaps start looking at the site in a little more detail. Perhaps the recent update pushed a button on your site for Google and they are reindexing? Hard to tell at this stage, sadly. -Andy

    | Andy.Drinkwater
    0

  • My apologies for taking so long to get to this question after you asked. Here are my thoughts. Have you seen the article I wrote for Moz on Panda and thin content? https://moz.com/blog/have-we-been-wrong-about-panda-all-along
    I don't actually believe that Google demotes/penalizes eCommerce sites for having thin product pages; I think it's much more complicated than that. Most of the eCommerce sites I have seen hit by Panda were, in my opinion, hit because they had very little to offer users that would make them rise above the competition. If 10,000 different sites are all selling the same product, which site is Google going to show at the top of the search results?
    When Panda first came out, people were quick to jump on the "duplicate content" bandwagon. Lots of people rewrote product descriptions because they felt they would be penalized for using stock descriptions. But this is not true. If an eCommerce site is demoted by Panda or by a quality filter, I think it's extremely unlikely to see improvement just because the product descriptions are rewritten. Similarly, I don't think that noindexing product pages will make a big difference in the eyes of Panda.
    Now, if a site has a huge number of URLs for each product (i.e. different sizes, colours, options, etc.), it's important to canonicalize those pages. In my opinion, this isn't for Panda reasons, though, but rather to optimize your crawl budget and make it easier for Google to understand your site. You don't want Google to spend all of its time crawling 2,000 variations of one product and never visiting the rest of your site.
    So, back to your original question: should we noindex product pages with little or no product description? I don't think there is a black-and-white answer. I would start by looking at analytics data to see how user engagement is on these pages. If I'm looking for a particular product, the page may not actually need a description; if your site is one of the few that sells this product and the page itself is useful, it might be fine. Check your analytics: are people spending time on these pages? Are they immediately bouncing off? Are they making purchases after visiting? Or are they mostly pages that nobody ever visits? If that's the case, then perhaps they shouldn't be in Google's index.
    Another thing to look at is whether these product pages are frustrating to users. If you do have some indexed, look at the data in Google Search Console's Search Analytics and see what queries those pages rank for. Are those pages likely to answer the user's query? If they are likely to frustrate users instead, they could be a Panda risk. For example, say a product page ranks relatively well for queries like "how to choose a [product]", "what sizes does [product] come in?", and "[product] user reviews", but doesn't actually answer any of those questions. In my opinion, if your product pages consistently fail to provide searchers with what they want, they are at risk of a Panda demotion, and that demotion could affect your site as a whole.
    I think Google is getting much better at figuring out which sites are most helpful to users. In most cases, rather than deciding what to index and what to noindex, the better spend of time and money would be finding ways to improve the overall user experience so that your site is by far the better option than your competitors'. It's hard to do that objectively, though; you may need impartial users to visit your site and your competitors' sites and tell you honestly which they would prefer for research and for purchasing. I've likely skirted your question a little. I don't think the answer is black and white.
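On the canonicalization point above: the idea is that every size/colour/option variant URL should declare a single canonical product URL. A minimal sketch of computing that URL, assuming variants are encoded as query parameters (the parameter names `size`, `colour`, and `option` are hypothetical; adjust them to however your platform encodes variants):

```python
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

# Hypothetical query parameters that only select a product variant.
VARIANT_PARAMS = {"size", "colour", "option"}

def canonical_url(url):
    """Strip variant-only parameters so every variant shares one canonical
    URL, which then goes into <link rel="canonical" href="...">."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in VARIANT_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))
```

For example, `canonical_url("https://shop.example.com/widget?colour=red&size=xl")` collapses to `https://shop.example.com/widget`, while non-variant parameters like pagination are kept.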

    | MarieHaynes
    0

  • That's interesting that there are only two Google Partners in Charlotte, NC; I would imagine there would be more in a larger city. According to Search Engine Land, the side and bottom ads accounted for roughly 14.6% of total clicks. An ad in the top position targeting the same keyword gets, on average, a 14x higher click-through rate than the same ad on the right side, and since ads still appear at the bottom of the page, less than 7.3% of clicks are affected by the change. Search Engine Land also reports that, as a result of these changes, position 3 got the biggest boost in click-through rate. I think the percentage of traffic coming from Google varies widely depending on the strategies used to acquire traffic for any specific website. Bruce Clay discusses some of the effects the removal of the right-side ads had on organic search traffic. At my agency we have seen similar results in some of the PPC campaigns we run for our clients: we haven't noticed any negative impact, and in some instances have seen an increase in performance.

    | JordanLowry
    0

  • Well, if you hid it from only Google, that would be cloaking to a certain extent. But in this case I think you can easily get away with it, as it will improve the user experience.

    | Martijn_Scheijbeler
    0

  • It still looks to be rolling out; lots of turbulence still. I have been lucky so far to see only small movements and not the big drops some have seen. Fingers crossed it stays that way.

    | TimHolmes
    0

  • I don't think you are the only one... it would seem there has been a major core Google update (see the coverage on SEO Roundtable), and it is still rumbling on. All of the major search-watch sites are also reporting a lot of activity, some showing gains/recoveries, some showing massive drops: Algoroo, MozCast, SERPwatch.

    | TimHolmes
    0

  • Great this helps, thank you!

    | BeckyKey
    0

  • Perfect thank you!

    | BeckyKey
    0

  • Hi there, Image search is known to be more random than understandable. Also, there has been some movement in the algorithm in the past 5-6 weeks; check two of the most-used trackers, MozCast.com and Algoroo.com. Also, have you checked which images were the ones that ranked? Is it possible that those images were updated, or that the newly ranked ones are better optimized? Have you noticed any changes on your server? Load time is a plausible cause. Best of luck, GR.

    | GastonRiera
    0

  • Hey Matt! Hmm, I don't think I've used Keyword Explorer for that purpose, yet. I still need to go back and really dive into the tool. I'll check it out! Thanks!

    | WWWSEO
    0

  • Thanks! Still not 100% sure what caused it but I'll keep looking!

    | BeckyKey
    0

  • Update: In addition to the following happening on our /shop/ section (the bread and butter of the site): 1) stupidly moving it to shop.domain.com for 2 months, redirecting everything there, then deciding to move it back to domain.com/shop/...; 2) the developer failing to enable canonicals, resulting in the new shop install having 4+ duplicate pages for every product for about 5 months. I have now found that the default setting for Magento's 'Auto-redirect base URL' option is a 302 redirect. Our base URL changed from HTTP to HTTPS. This means that for probably the last 9 months our store home page has been 302'd (no link juice passing, and far too long to use a temporary flag like this). This 302 is Magento's default, and my developer failed to point out the devastating effect it could have on rankings if we didn't change it to 301. I'm not sure whether this played a role in our lost rankings, as the store is just a sub-section of the site, and I have no idea how I am going to fix this and tell Google, "Wait! Here's a 301 instead! Please restore our juice!"
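For reference, alongside switching Magento's 'Auto-redirect base URL' setting to 301, the HTTP-to-HTTPS hop can be forced as a permanent redirect at the web-server level. A minimal sketch, assuming Apache with mod_rewrite:

```apache
RewriteEngine On
# Send all plain-HTTP requests to HTTPS with a permanent (301) redirect,
# so engines treat the move as final and consolidate link signals.
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```

Once Google recrawls the old HTTP URLs and sees the 301 instead of the 302, the signals should consolidate onto the HTTPS versions over time.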

    | HLTalk
    0

  • Hi there. Your question is not really a question, more your thoughts, maybe? Anyway, Google News rankings are based on typical SEO signals, yes, but more than anything on time sensitivity, mentions, and social signals. So an article released today that got traction on social media (shares, retweets, etc.) and earned some links will perform at the top. Base your optimization on that: the more newsworthy and shareable your content is, the better results you're going to get. Hope this makes sense.

    | DmitriiK
    0

  • It would be safe to assume that Panda considers engagement metrics as part of its analysis of site/page quality. For example, a page with additional UGC, comments, shares, etc. is likely to indicate higher-quality content than a page without any of those factors. Jake Bohall

    | HiveDigitalInc
    0

  • Hi there. There are several things you can do to avoid cannibalization: 1) When creating pages for individual webinars, don't duplicate content from the product pages; just write a description of what the webinar is about plus the typical details (times, speaker info, etc.). 2) If you do post duplicate content from the product pages, use either meta robots or robots.txt to keep those pages from being indexed. 3) Add canonical links from the webinar pages to the product pages to "redirect" all the juice and rankings. I think these are the ways to do it, and #1 is how I'd approach it. Hope this helps.
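Options 2 and 3 above are each a one-line addition to the webinar page's head; the product URL here is a placeholder. (Worth noting: robots.txt only blocks crawling, so a meta robots tag is the more reliable way to keep an already-discovered page out of the index.)

```html
<!-- Option 2: keep the duplicate webinar page out of the index -->
<meta name="robots" content="noindex,follow">

<!-- Option 3: consolidate signals onto the product page instead -->
<link rel="canonical" href="https://www.example.com/products/widget">
```

You would use one or the other, not both: noindex removes the page from results, while the canonical asks Google to rank the product page in its place.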

    | DmitriiK
    0

  • Hi Liz, What I would do as well is set up your robots.txt so that crawlers respect it and don't go off hunting for new URLs that are embedded somewhere else. Usually it's not something you should worry about too much; it's just the crawler doing a good job of trying to find more content/URLs on your site. Martijn.
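A minimal robots.txt sketch for this; the /embed/ path is hypothetical, so substitute whatever pattern the stray embedded URLs actually share:

```
User-agent: *
# Keep all crawlers out of the auto-discovered embedded URLs.
Disallow: /embed/
```

Well-behaved crawlers will stop requesting anything under that path, though URLs they already know about can still appear in the index without a snippet.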

    | Martijn_Scheijbeler
    0

  • Hi William, There's definitely some merit to this. Google has been hard at work improving the user experience over the past couple of years, and part of that is better matching query intent to the content on the page. If they can tell from the query that the searcher is researching a product to buy, they'll give preference to the site that provides the most information, including pricing. I've got a client that only sells B2B, so they don't show pricing unless you're logged in, and they've been slowly losing ground to sites that sell similar products but are open to selling to anyone. It's a struggle we've been dealing with, and the only cause we've been able to identify is the lack of pricing information. You might try including a price range, like "starting at..." or something along those lines. It's worth testing to see if you're able to recoup some rank.
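If you do surface a range rather than exact prices, it can also be marked up so engines can read it. A sketch using schema.org's AggregateOffer type; the product name and figures are placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example B2B Widget",
  "offers": {
    "@type": "AggregateOffer",
    "priceCurrency": "USD",
    "lowPrice": "99.00",
    "highPrice": "499.00"
  }
}
</script>
```

This lets you advertise a "starting at" range without exposing the exact logged-in pricing.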

    | LoganRay
    0

  • Are you filtering spam referrals out of your Analytics data? A lot of SEOs noticed that spammy Analytics traffic declined significantly in the first quarter of 2016 (unfortunately, the spam has definitely come back over the last 2-3 months). If you aren't filtering out spam and the site had a lot of it in 2015, that alone might account for a 15% drop in overall reported traffic. Your points are valid, but cover your bases to ensure there are no hidden issues: perform a technical audit and risk assessment of the site if you haven't done so already, checking for crawl issues, page speed, content duplication, and risky backlinks (to name a few).
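As a sketch of the filtering idea, here's how you might flag known spam referrers in an exported sessions list before comparing year-over-year traffic. The two patterns are well-known referral spammers from that period, used purely as illustrations; a real exclusion list would be much longer:

```python
import re

# Illustrative hostname patterns for known referral spammers (not exhaustive).
SPAM_PATTERNS = [
    re.compile(r"(^|\.)semalt\.com$"),
    re.compile(r"buttons-for-website"),
]

def is_spam_referrer(hostname):
    """True if the referring hostname matches a known spam pattern."""
    return any(p.search(hostname) for p in SPAM_PATTERNS)

def clean_sessions(sessions):
    """Drop sessions whose referrer is flagged as spam."""
    return [s for s in sessions if not is_spam_referrer(s["referrer"])]
```

Running both years' exports through the same filter puts the comparison on an even footing, so a "drop" that is really just vanished spam shows up as flat legitimate traffic.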

    | LauraSultan
    0

  • Hi there, When you did these redirects, did you properly map them and follow migration best practices? Failing to do so can lead to a major drop. I would also make sure you are following international SEO best practices and taking advantage of the proper indicators for country/language-specific websites and content. Finally, follow up on your backlinks, citations, and listings, and make sure they point to the new URLs and structure, so that you get the full effect of your backlinks. Following the resources above and these backlink strategies should help you get back on track. Let me know if you have any questions or concerns! Good luck! Patrick
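On the international-SEO point, the standard country/language indicator is hreflang annotations in each page's head (or in the XML sitemap). A minimal sketch; the URLs are placeholders for your own regional versions:

```html
<link rel="alternate" hreflang="en-us" href="https://www.example.com/us/" />
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/uk/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```

Each regional page should carry the full set, including a self-reference, so Google can serve the right version to the right country after the migration.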

    | PatrickDelehanty
    0