Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Usually, the reason is that category pages tend to be light on content and thus not as indexable; sometimes category pages are even noindexed for that reason. Google is likely looking at your product page and finding it more relevant, based on content, than the category page for people who are looking for "sleeping bags."

    | EricaMcGillivray
    0

  • Very good question. Just yesterday I was discussing the same concept with my seniors. In my knowledge and experience, search engines may give some preference to industry-specific TLDs, for example .tv for the live TV industry, or .media and .news for the media industry; I've seen websites on these domains get some advantage in rankings, although all the usual SEO techniques still need to be followed. From the average person's perspective, though, people are familiar with .com, .co.uk, .es, and .net domains, so sites on those TLDs connect more easily with users. I would always try first for a .com, .in, .net, or .co.uk domain. There are exceptions, such as when the domain matches your company name; in that case, go for it. For example, the company Visually uses the URL visual.ly.

    | sourabhrana
    0

  • Hi Angelo, Those location-specific domains are what we refer to as ccTLDs (Country Code Top Level Domains), and while they are a minor signal to both search engines and users, if they're being 301'd to the .com they won't be offering any signals at all. The only way they'd be passing strength is if they have backlinks pointing to them. If that's the case, you can expect ~80-90% of this strength to be passed via the redirect, the same as a standard link, according to Matt Cutts in 2013. If you really wanted to put those ccTLDs to good use, you could create a geo-specific site on each one and have them rank individually for their respective countries. However, I generally recommend against doing this, simply because you're tripling the amount of time and budget required, since each domain has to stand on its own merit.

    | ChrisAshton
    0

  • It's not so much that this is going to hurt you; it just won't really help, either. You're far better off putting the effort into creating genuinely helpful content for each one. It's time-consuming, but you only get out what you put in! You're not going to get a manual penalty for this sort of thing, but it isn't exactly a great quality signal to have most pages on your site showing the same content over and over.

    | ChrisAshton
    0

  • Thank you so much, both of you; this has saved me a massive headache. I had to close the script twice, as there were two instances where the script tags were opened and never closed. I ran the site through Screaming Frog and the titles/metas are showing up again. I have voted both of you up.

    | MintySEO
    0

  • Hello, I implemented rich snippet breadcrumbs in the last 3 months as well, and my experience so far is that Google does what Google does, and I mean that 100%. Only a handful of our pages show up with breadcrumbs in SERPs; at first it was only our blog posts, then I noticed another section, but it's not always consistent. As time goes on, it seems more and more are discovered, but it's still not as consistent as I'd like. So give it a month or so; if you don't notice any breadcrumbs being pulled in, either check your code or wait some more. Google is still showing page titles from two redesigns ago, even after reindexing that page a few times and trying a few other things. Google uses a toddler-like AI that will do what it thinks is the "best" option for the relevance of its searchers.
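    The breadcrumb markup under discussion is schema.org BreadcrumbList structured data, usually embedded as JSON-LD. As a hedged sketch (the page names and URLs below are placeholders, not taken from any site in the thread), here is the shape Google's documentation describes, built in Python:

```python
# Hedged sketch: schema.org BreadcrumbList markup, the structured data Google
# reads for breadcrumb rich snippets, rendered as JSON-LD. Page names and
# URLs are placeholders for illustration only.
import json

breadcrumbs = {
    "@context": "https://schema.org",
    "@type": "BreadcrumbList",
    "itemListElement": [
        {"@type": "ListItem", "position": 1, "name": "Blog",
         "item": "https://example.com/blog/"},
        {"@type": "ListItem", "position": 2, "name": "SEO",
         "item": "https://example.com/blog/seo/"},
    ],
}

# The printed JSON goes inside a <script type="application/ld+json"> tag.
print(json.dumps(breadcrumbs, indent=2))
```

    Even with valid markup, Google decides per page whether to show the rich result, which matches the inconsistency described above.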

    | Deacyde
    0

  • Thanks a lot, Gianluca. I thought we were limited to languages existing inside the targeted country. I'll try it and check if there is any kind of ranking boost. For now, I have nothing set in "webmaster tools" or in the code.

    | rootsalad
    0

  • Brad, since the domain is not going to be yours and you'll be unable to actually verify it in Google Search Console, you probably won't be able to use the Google Change of Address Tool. However, if you use 301 permanent redirects to point the content to your site, that should be good enough. Typically, when you use the Change of Address Tool you get more "credit" from Google, as it looks like they will pass most of the "link juice" over to the new domain. You don't get that when you only 301 redirect from domain to domain, so you will lose some "link juice." I do recommend that you go ahead and use the 301 redirects.
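    As a small sketch of the check being recommended (the helper and domain names are hypothetical, not part of the answer), a redirect only passes the value described above when the response is a 301 whose Location header points at the new domain:

```python
# Hedged sketch: a 301 permanent redirect passes most link equity, so it's
# worth verifying both the status code and the redirect target. The helper
# and domain names here are made up for illustration.
from urllib.parse import urlparse

def is_permanent_redirect(status: int, location: str, new_domain: str) -> bool:
    """True only for a 301 whose Location header points at the new domain."""
    return status == 301 and urlparse(location).netloc == new_domain

print(is_permanent_redirect(301, "https://new-domain.com/page", "new-domain.com"))  # True
print(is_permanent_redirect(302, "https://new-domain.com/page", "new-domain.com"))  # False: temporary redirect
```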

    | GlobeRunner
    0

  • As others have said: maybe, maybe not. In my vertical, if "Macy's coupons" is the keyword, then everyone in the field puts it first in the title tag, because the conventional wisdom is that this is the optimal placement. The problem with that strategy is that everyone's title tags are nearly indistinguishable from one another. Literally nothing in the column stands out from anything else. Blah. When we added a dynamic count to our titles, e.g. "7 Macy's Coupons", and wrote them so they remained keyword-rich while also reading more naturally, we observed two important things: 1. Our rankings did not change at all. 2. CTR went up. The working hypothesis is that the number helps us stand out and the natural structure is easier to read and digest. It's optimized for human eyeballs just as much as it's optimized for ranking. That's just one example, of course, and I'm sure others can line up with examples where the slightest change to a title tag knocked them back to page 30. I'd recommend testing it out.
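    The dynamic-count titles described above amount to a tiny template; a sketch, with the merchant name and wording as illustrative assumptions rather than a prescribed format:

```python
# Hedged sketch: build a keyword-rich title that leads with a live offer
# count, as in the experiment described above. The template wording is an
# example, not a rule.
def build_title(merchant: str, active_coupons: int) -> str:
    """Return a title like "7 Macy's Coupons & Promo Codes"."""
    return f"{active_coupons} {merchant} Coupons & Promo Codes"

print(build_title("Macy's", 7))  # 7 Macy's Coupons & Promo Codes
```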

    | BradsDeals
    0

  • Sounds like you've found the issue, or at least the main one. I would wait on the dev site to be removed before starting over with a new domain. KJr

    | KevnJr
    0

  • Yup. Also, don't forget that robots.txt is just a "recommendation" for robots; they don't have to obey it, and basically Google does whatever it wants to. Also, if you block a folder so its inner content won't be crawled, a page inside it can still be indexed if any link points to it, even a link coming from outside your domain. The content itself won't be shown in search results, but the URL will appear with a notice stating that the content is blocked by the site's robots.txt. Best of luck!
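    The crawl-versus-index distinction above can be demonstrated with Python's standard-library robots.txt parser, which models what a well-behaved crawler would honour (the rules below are a made-up example):

```python
# Sketch: robots.txt is advisory. A Disallow rule only blocks crawling by
# compliant bots; it does not stop a URL from being indexed if it's linked
# from elsewhere. The rules here are invented for illustration.
from urllib.robotparser import RobotFileParser

rules = """
User-agent: *
Disallow: /private/
""".strip().splitlines()

rp = RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("*", "https://example.com/private/page.html"))  # False: crawling disallowed
print(rp.can_fetch("*", "https://example.com/public/page.html"))   # True
```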

    | Yoav-Blustein
    0

  • Hi YotpoKaiser, Thanks for the question. I think this brings up a really good point about the review platform space. As you mentioned, it appears that Yotpo is doing this to track clicks back to you (that's why they 301 redirect). If I were in your shoes, I would take one of three approaches (from least to most risk-averse):

    1. Don't worry about the links. There are plenty of websites out there that spin up copies of your site, and the engineers at Google are brilliant enough to filter those out of your Google Webmaster Tools (GWT) data; they'll eventually catch these Yotpo links as well.

    2. Email the support team at Yotpo and see if they can noindex/nofollow the 301 redirect links that point back to your website. This is happening to you and to everyone else using the system; if they really want to provide a long-term review solution, this is INVALUABLE feedback for them.

    3. Disavow the links. As Eric and Deacyde mentioned above, you can go through a full analysis of the links and disavow them individually, or disavow the entire domain.

    I like taking a data-driven approach, so here's how I would decide. First, look at the impression data in GWT from before and after you started using Yotpo. Pick a handful (maybe 10) of your best-performing pages that also have reviews pointing to them (very important), starting with the highest traffic/impressions, and see whether the seasonally adjusted impressions are down. If you started using Yotpo more than 90 days ago, you can't use this technique due to the data limitations in GWT. If impressions have bottomed out (seasonally adjusted) for these pages, disavow the domain AND email the support team at Yotpo. Here are the instructions straight from Google; use the domain:yotpo.com flag in your file.

    Second, look at the organic session/user data in Google Analytics (GA) from before and after you started using Yotpo. Use the same criteria here as with GWT: about 10 pages with a decent number of monthly visits (>300). If your data is small, this probably won't help much, as any change could cause a big swing. If your organic sessions have bottomed out (seasonally adjusted) for these pages, disavow the domain AND email the support team at Yotpo. If you are unsure about #1 and #2, email the support team at Yotpo and ask them to noindex/nofollow their 301 redirects; they owe it to you as a customer NOT to mess up your SEO. If you are OK with a little risk, then don't worry about it for now and monitor continuously.

    A word on seasonally adjusted data: I worked with a ski gear/equipment company in the past and we saw very dramatic swings in traffic from November to February. You are probably seeing something similar. If your work with Yotpo started during the high season, I'd recommend seasonally adjusting the data: divide the monthly session data by the percentage that Google Trends gives for that target keyword, in your country. Yes, it's A LOT of work, but it will give you a precise idea. Since you posted this question in April, I presume that's soon after you started, and I would recommend only having your "before" traffic start sometime in mid-March, after the high season and into the shoulder season/summer. Weather and all other seasonality do play a factor here, but without the data telling you what's going on, you could either over- or under-panic, and no one likes extra anxiety! Let me know if this helps!
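    The seasonal adjustment described above can be sketched in a few lines (the session counts and Trends index values below are invented for illustration):

```python
# Hedged sketch of the seasonal adjustment described above: divide a month's
# organic sessions by that month's Google Trends index (0-100) for the target
# keyword, expressed as a fraction. All numbers are made up for illustration.
def seasonally_adjust(sessions: float, trends_index: float) -> float:
    """Normalize raw sessions by the Trends index (0-100) for that month."""
    return sessions / (trends_index / 100.0)

# A "high season" month: 1,200 sessions while Trends sat at 80/100.
print(seasonally_adjust(1200, 80))  # 1500.0
# A "shoulder season" month: 600 sessions while Trends sat at 40/100.
print(seasonally_adjust(600, 40))   # 1500.0 -> demand-adjusted traffic held steady
```

    When the adjusted figures for the before and after periods stay level, the raw swing was seasonality rather than a Yotpo effect.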

    | RFLTyler
    0

  • Answered my own questions: https://developers.google.com/webmasters/control-crawl-index/docs/robots_txt?csw=1#file-format "A maximum file size may be enforced per crawler. Content which is after the maximum file size may be ignored. Google currently enforces a size limit of 500kb."
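    That size cap is easy to guard against when generating a large robots.txt; a minimal sketch, where the constant mirrors the 500 KB figure quoted from Google's documentation:

```python
# Sketch of the limit quoted above: Google may ignore robots.txt content past
# roughly 500 KB, so a generated file is worth checking before deployment.
MAX_ROBOTS_BYTES = 500 * 1024  # the 500 KB cap from Google's documentation

def robots_within_limit(content: bytes) -> bool:
    """True if the robots.txt payload fits under the documented size limit."""
    return len(content) <= MAX_ROBOTS_BYTES

print(robots_within_limit(b"User-agent: *\nDisallow:\n"))  # True
```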

    | ThomasHarvey
    0

  • Hi Paul, This question is marked as "answered," so there aren't many SEOs reading it, unless they have the same problem as you! Please post a new question, so we can get the right people to answer. Best, Kristina

    | KristinaKledzik
    0

  • I have mocked up a small URL hierarchy pic to illustrate the two cases above: Type -> Geography [image: nuHkEK6.jpg] Geography -> Type [image: 35pPdq0.jpg] EDIT: I realized afterwards that the **Geography -> Type** illustration is incorrect in the second leftmost column, second farthest-down row. It should say domain.se/city-1/neighborhood-1/street-address-1/type-2 instead of domain.se/space-type/city-1/neighborhood/street-address-1/type-2. It should also include internal links between that box and the one to its left.

    | Viktorsodd
    0

  • Hi Bob, This isn't really the best place to ask this question, since the valuable contributors here aren't here to self-promote or actively pick up work from what is essentially a help forum. To point you in the right direction, Moz does have a Recommended Providers list which is worth checking out.

    | ChrisAshton
    0

  • The most popular sitemap plugin for Magento appears to be this one: https://www.magentocommerce.com/magento-connect/xml-sitemap-generator-splitter.html, the XML Sitemap Generator & Splitter.

    | GlobeRunner
    0

  • Thanks, Martijn. That makes a lot of sense. I'm working with small websites, but hopefully I will be moving on to bigger fish.

    | MarketingChimp10
    0