Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • +1 for EGOL. I would play with the pricing strategy instead of using noindex and nofollow on my site. These unwanted service pages might have valuable Page Authority and pass link equity through internal navigation, so noindex and nofollow can potentially hurt your site's overall organic search performance. If you don't want Google to crawl these pages looking for new information, simply block crawling in robots.txt but leave them in Google's index.

    | Gyorgy.B
    0

  • You mean "featured snippet"? http://www.hmtweb.com/marketing-blog/featured-snippets-from-seo-sem-industry/ http://searchengineland.com/swapped-losing-google-featured-snippet-case-study-228899 Honestly, no one knows. But you can implement semantic ItemList markup for this, and it could trigger a featured snippet for your page. PS: In the first article, WS have 9k featured snippets but rank for 580 queries. LOL. -> https://twitter.com/glenngabe/status/689807106375979008

    | Mobilio
    0

  • Unless there's a reason to take the 301s down, don't. If you don't want to have to wade through them all when you're editing your .htaccess file, you could monitor your server logs, which should show all requests and responses your server receives and sends. When no requests for a URL (and therefore no 301 responses) come in for some long period of time (a month? 6 months? a year?), then you may be able to safely remove the 301 directive from your .htaccess (or wherever you configure your 301s).

    | 4RS_John
    0
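The log check described above can be sketched in a few lines of Python. This is a minimal illustration, assuming an Apache/Nginx "combined" access log format; the sample lines and the paths in them are made up:

```python
import re
from collections import Counter

# Matches the request path and status code in a combined-format log line.
LOG_LINE = re.compile(r'"(?:GET|POST|HEAD) (\S+) [^"]*" (\d{3})')

def count_301_hits(log_lines):
    """Return a Counter of request paths that received a 301 response."""
    hits = Counter()
    for line in log_lines:
        m = LOG_LINE.search(line)
        if m and m.group(2) == "301":
            hits[m.group(1)] += 1
    return hits

# Hypothetical log lines for illustration.
sample = [
    '1.2.3.4 - - [10/Oct/2016:13:55:36 +0000] "GET /old-page HTTP/1.1" 301 0 "-" "-"',
    '1.2.3.4 - - [10/Oct/2016:13:55:37 +0000] "GET /new-page HTTP/1.1" 200 512 "-" "-"',
]
print(count_301_hits(sample))  # Counter({'/old-page': 1})
```

Run this over a month (or more) of logs; redirected paths that never show up in the counter are candidates for removal.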

  • Hi Andrew, This is very helpful, thank you. I have already put together a case for improving our page speed, so it's something I'll push harder with the developers. I am also working on a section of the site that will include user guides and helpful articles, so this is great. Thank you!

    | BeckyKey
    0

  • Run Screaming Frog on your subdomains and check the Images tab in the report, then sort by image size and you'll find the large images. Download Screaming Frog from here: http://www.screamingfrog.co.uk/seo-spider/

    | Gyorgy.B
    0

  • Yes. A disavow file is needed for each site version (http and https).

    | Mobilio
    0

  • At this point, other than "It's probably not Penguin," we don't have much insight into what's been going on over the past week, beyond the fact that, as Peter N. said, multiple tools are showing rankings shake-ups. If you're talking about a total loss of top pages, though, I think it may be premature to assume an update was in play. I'd definitely thoroughly check the technical aspects. Are these pages still being indexed? Are they being cached properly? Do they show up for longer-tail or exact-match terms (in quotes)? In other words, have they dropped in ranking, or are they ranking for nothing at all? The more you can pin down, the better. Unfortunately, it's very hard to speak in generalities and tell you what factors were involved in this week's updates. It really takes a deep dive into the site(s) in question.

    | Dr-Pete
    0
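The quick checks above can be run directly with standard Google search operators (example.com stands in for the affected site):

```
site:example.com/dropped-page       -> still in the index?
cache:example.com/dropped-page      -> cached copy current?
"an exact phrase from the page"     -> ranking for anything at all?
```

If the page answers the `site:` query but not the exact-phrase query, it is indexed but ranking for nothing, which points away from a simple algorithm-update explanation.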

  • Thank you for the responses, I think that tells me what I need to know!

    | dbaxa-261338
    1

  • Yes - there is a bug in your robots.txt. You should write something like: Disallow: /?display=table or: Disallow: /?display=*

    | Mobilio
    0
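As a sketch of what those lines could look like in context (the `display` parameter comes from the question; adjust the paths to your own URL structure):

```
User-agent: *
Disallow: /?display=table
Disallow: /?display=*
```

Note that Google treats `*` as a wildcard within paths and matches plain paths by prefix, so it's worth testing the rules in Search Console's robots.txt Tester before deploying.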

  • Hi Martijn, Yeah, not planning to block them from robots.txt, of course. By blocking, I meant reducing the crawl rate to zero temporarily to make sure we're not creating any URL-related confusion for bots. But this might not be a good solution for our customers, as a customer might be redirected to /new-url on the first hit, which might then give them an error in the next session.

    | _nitman
    0

  • Oh, wow - if you're talking a couple of years ago and major ranking drops, then definitely get aggressive. Remove as many as possible and noindex the rest. If you've got the robots.txt directives in place, Google shouldn't put them back (although, from past experience, I realize "shouldn't" isn't a guarantee). If you're down 90%, you've got very little to lose, and clearly Google didn't like something about that set-up. Unfortunately, that's about the most drastic reasonable option. The next step would be to start over with a fresh domain and kill all of the old domains. That could be a lot more hazardous, though.

    | Dr-Pete
    0
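A minimal sketch of the page-level noindex directive mentioned above, placed in the `<head>` of each page to be dropped:

```html
<meta name="robots" content="noindex">
```

One caveat worth knowing: Googlebot has to crawl a page to see this tag, so a page that is also disallowed in robots.txt may keep its noindex unread.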

  • That's the thing: it's not necessarily a topic, but we still want to rank for those terms. I guess instead I would ask: if we don't publish a unique landing page for those highly competitive terms, what are the other best ways to influence rankings for those terms?

    | JustinMurray
    0

  • The XML sitemap format is well defined here: http://www.sitemaps.org/protocol.html But I can quickly summarize:
    - A sitemap is limited to 50,000 URLs and 50MB per file. If you need more, you can split them with a sitemap index file; a sitemap index can reference up to 50,000 sitemaps and can be up to 10MB.
    - lastmod, priority, and change frequency don't play a HUGE role anymore: https://www.seroundtable.com/google-lastmod-xml-sitemap-20579.html https://www.seroundtable.com/google-priority-change-frequency-xml-sitemap-20273.html but keep them so the file is fully formatted.
    - Sitemaps can be compressed (gzip).
    - A sitemap must be UTF-8 encoded, but beware of entities - ampersand, single quote, double quote, greater than, less than. You must replace them with entity escape codes.
    - You can put the sitemap location in robots.txt, and you can list several sitemaps there. Sitemaps can be located on 3rd-party servers too.
    I think that covers the most important points about XML sitemaps.

    | Mobilio
    0
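The escaping and gzip points above can be sketched in Python. This is a minimal illustration (the URLs are placeholders); `xml.sax.saxutils.escape` handles the ampersand/less-than/greater-than entities the answer warns about:

```python
import gzip
from xml.sax.saxutils import escape

# Hypothetical URLs; note the ampersand that must be escaped.
urls = ["https://example.com/", "https://example.com/page?a=1&b=2"]

def build_sitemap(urls):
    """Build a minimal UTF-8 XML sitemap string with escaped <loc> values."""
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url in urls:
        lines.append("  <url><loc>%s</loc></url>" % escape(url))
    lines.append("</urlset>")
    return "\n".join(lines)

xml = build_sitemap(urls)

# Sitemaps may be served gzip-compressed.
with gzip.open("sitemap.xml.gz", "wt", encoding="utf-8") as f:
    f.write(xml)
```

A real generator would also enforce the 50,000-URL / 50MB limits and emit a sitemap index when they are exceeded.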

  • Even if you implemented all the tags correctly, there is no guarantee that Google will show them in the SERPs - the choice of whether or not to show them is entirely up to Google. I guess you already checked if all the tags are properly implemented (if not - check here or here). You could also check this page for the usage guidelines (bottom of the page). Again - even if you are doing everything correctly and you meet the guidelines, it's entirely up to Google whether the snippets will be shown or not. You can't control it. Dirk

    | DirkC
    0

  • Thanks a lot for your time in answering the question. I just found that Google indexed almost all of my URLs. It just took a few days to update. This platform is of great help. -Gautam

    | gowthamsm
    0

  • Will, I'm not familiar with the CMS you're using, but to answer your question about rel=canonical, no, that is not an instance of when to use that tag. Canonical tags are used for times when duplicate content is unavoidable, such as sorting a product category page and having different URL parameters based on the sort type.

    | LoganRay
    0
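To illustrate the sort-parameter case Logan describes (the URLs and the sort parameter are hypothetical), every sorted variant of the category page points back at the unsorted URL:

```html
<!-- On /shoes?sort=price-asc and on /shoes?sort=newest alike -->
<link rel="canonical" href="https://www.example.com/shoes" />
```

That way the duplicate sorted views consolidate their signals onto one canonical category page.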

  • I think that A1 Sitemap Generator supports this function.

    | Mobilio
    0

  • If you will just display icons, then it's OK. But it's dangerous to hide text with display: none: https://www.seroundtable.com/google-display-none-20626.html https://support.google.com/webmasters/answer/66353?hl=en https://youtu.be/7y-m_jiayLQ https://youtu.be/B9BWbruCiDc https://www.seroundtable.com/google-hiding-content-17136.html https://www.seroundtable.com/google-hidden-tab-content-seo-19489.html Hiding text gives it "less weight" and you may get into trouble.

    | Mobilio
    1
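A sketch of the distinction drawn above (the file name and copy are hypothetical):

```html
<!-- Generally fine: a decorative icon, with its label available as alt text -->
<img src="menu-icon.svg" alt="Menu">

<!-- Riskier: indexable copy hidden from users with display: none -->
<div style="display: none;">Keyword-rich text visitors never see</div>
```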

  • Hi Nico, Tim's example should get you what you need to mark up your aggregate ratings correctly. I wanted to take a minute to address your other questions. Anything between the opening and closing tags in an example is something you should customize to your content. So in the example above, <span property="name">Super Book</span>, you would replace "Super Book" with whatever the name of the product being reviewed is. For any example of Schema markup, if the example includes information that isn't on your page, you can just delete those properties. For Publisher markup, the "publisher" isn't the person who wrote the review; it's the website as a whole (that's you) that is publishing the content. In terms of whether or not Google will include the ratings snippet since it can't verify whether ratings are real: in my experience they will, especially if you have a good volume of reviews.

    | RuthBurrReedy
    0
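A hedged sketch of an AggregateRating block in RDFa, using the "Super Book" name from the answer (the rating numbers are placeholders to be replaced with your real values):

```html
<div vocab="http://schema.org/" typeof="Product">
  <span property="name">Super Book</span>
  <div property="aggregateRating" typeof="AggregateRating">
    Rated <span property="ratingValue">4.4</span> / 5
    based on <span property="reviewCount">89</span> reviews
  </div>
</div>
```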