Category: Intermediate & Advanced SEO
Looking to level up your SEO techniques? Chat through more advanced approaches.
-
Removing Toxic Back Links Targeting Obscure URL or Image
Hi Kingsplan, The problem here is that you would still have them pointing at your domain. Even if they 404'd, the links would still be hitting your site. I would:
1. Ascertain whether they are actually toxic - look at SEMrush or similar (anything over 65 toxicity is worth removing). We have had a lot from 'the globe' recently; I have disavowed them for some sites but honestly, they don't appear to have done any harm to others.
2. Email the 4 or 5 domains and ask them to remove the links.
3. If you don't get a response after 2 weeks or so, simply add those domains to the disavow file.
Regards, Nigel
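For reference, the disavow file in step 3 is just a plain text file uploaded through Google's Disavow Links tool; a minimal sketch (the domain names below are made up, not from this thread) might look like this:

```text
# Domains that ignored removal requests
domain:spammy-directory-example.com
domain:toxic-links-example.net

# Individual URLs can also be listed on their own lines
http://www.another-example.org/spam-page.html
```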
| Nigel_Carr1 -
Can I cover any topic(s) or are there some required ones?
Thank you for your detailed reply.
| seoanalytics0 -
How to compete against search terms that use geo-modifiers?
Glad the reply was helpful, Chook1. You're in a tough spot and I wish there was an easier answer to give you, but know that I'm wishing you good luck in this venture. May your as-yet-unknown beautiful places for bike tours become world-class destinations in the future!
| MiriamEllis0 -
Canonical - unexpected page ranking
Thanks Martin, looked at Moz's guide on canonicals. Have a much clearer view of them now. Thanks for the feedback; we will work on the pages we want to rank and monitor them. Ian
| Substance-create0 -
Canonical and Alternate Advice
That would normally be the case, but not tonight. LOL, I am picking up a lot of the UK Q&A. I will be at BrightonSEO and SearchLove London - if any of you guys will be in the area, I'd love to grab a pint! Sincerely, Thomas
| BlueprintMarketing2 -
Move to new domain using Canonical Tag
This is exactly right and is a great answer. Canonical tags stop content duplication from being a problem and can alleviate duplication-related devaluations (or, in extreme cases, penalties). What canonical tags don't do anywhere near so well (if at all) is transfer SEO authority from one page to another. If the OP did what they were suggesting, the risks would be: (1) Google interprets the canonical tags wrong, (2) Google starts ranking pages on the new site instead of the old pages, but (critically) without any appended backlink equity, and (3) all rankings are then lost on both sites. I'd be extremely, extremely hesitant to deploy in the OP's specified manner, and I think that Nigel is 110% correct here.
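For contrast, the job canonical tags actually do well is consolidating duplicate URLs within the same site onto a preferred version; a minimal illustration (hypothetical URLs, not from the thread):

```html
<!-- Placed on a duplicate/parameter version of a page, e.g. /shoes?sort=price -->
<link rel="canonical" href="https://www.example.com/shoes" />
```

A domain move is a different job, and that is what 301 redirects are for.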
| effectdigital0 -
How Many Links to Disavow at Once When Link Profile is Very Spammy?
As Michael Edwards pointed out, you need to spend some time looking at the links & sites yourself to ascertain their suitability.
| jasongmcmahon0 -
Why does Google not show my site for my brand keyword?
I have the same issue with my website. How can I fix it? I have tried several SEO tools, and I think I have passed on-page optimization. [url=https://tvlankalive.com/]Sinhala Teledramas[/url]
| ipl7440 -
When using ALT tags - are spaces, hyphens or underscores preferred by Google when using multiple words?
YES! Always use lowercase for filenames, because if you use upper and lower case (sometimes called camel case) in your internal and menu linking, Google will crawl it and index the mixed-case version. Then the fun begins when you have to match your sitemap to that!
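A small illustration of the convention (the filename is hypothetical): lowercase, hyphen-separated words in the filename, and ordinary spaced wording in the alt attribute:

```html
<img src="/images/red-garden-wheelbarrow.jpg" alt="red garden wheelbarrow on a lawn" />
```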
| gravymatt-se2 -
Potential downside of removing past years' calendar events
Thank you for your response. The old pages give us very little traffic - as few as ONE page view over the past six months. Here are two examples (the first got 1 page view, obviously NOT a lot of traffic):
https://www.landmarkschool.org/calendar/day/2018-01-06
https://www.landmarkschool.org/calendar/day/2019-01-06
Another issue is that these events are generated from the calendar but don't have /calendar/ in the URL, and I'm getting penalized for duplicate content:
https://www.landmarkschool.org/blue-day-47
https://www.landmarkschool.org/blue-day-48
https://www.landmarkschool.org/performing-arts-tech-weekend
https://www.landmarkschool.org/performing-arts-tech-weekend-0
https://www.landmarkschool.org/performing-arts-tech-weekend-1
https://www.landmarkschool.org/performing-arts-tech-weekend-2
Any advice is appreciated!
| BGR0 -
Site migration / CMS / domain / site structure change - no access to Search Console
If your architecture is changing (e.g. from non-www to www, then from HTTP to HTTPS), just be careful that your developer's logic doesn't start 'stacking' redirect rules. You want to avoid this:
A) user requests http://oldsite.com/category/information
B) 301 redirect to http://newsite.com/category/information
C) 301 redirect to https://newsite.com/category/information
D) 301 redirect to https://www.newsite.com/category/information
Keep your redirects strictly origin to final destination, and you'll probably be ok! In the case of my example, the redirect should go straight from A to D, not from A to B (hope that makes sense) - there's a sketch of this below. Install this Chrome extension so that you can see redirect paths in your Chrome extension buttons menu. It's very, very handy for testing redirects.
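A single-hop rule of the kind described above might look like this sketch (Apache .htaccess on the old domain; the domain names are the hypothetical ones from the example, and real rules will need tailoring to your server and edge cases):

```apache
RewriteEngine On

# Catch every request that still arrives on the old host, in any form
# (http or https, www or non-www), and send it straight to the final
# destination in one 301 hop instead of stacking protocol/host redirects.
RewriteCond %{HTTP_HOST} ^(www\.)?oldsite\.com$ [NC]
RewriteRule ^(.*)$ https://www.newsite.com/$1 [R=301,L]
```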
| effectdigital0 -
301 Question - issue
It's probably just taking Google a while to process all the changes. Really, your 301s should point to the same content, not just all go to the homepage. If you had pages showing on two sites, the pages do 'really' exist on one site but weren't supposed to exist on the other. Correct the 301s so that they point from the URLs on the affected site to the exact same pieces of content on the site where they were originally located (where they were supposed to be located). If that fails, use the X-Robots-Tag HTTP header (not noindex meta tags - fire the noindex directive from the HTTP header instead of the HTML) to tell Google not to index those URLs on the 'affected' website. In conjunction with that, alter the status code of all bogus URLs on the 'affected' site to 410, which is stronger than 404 (it means GONE, not coming back; 404 just means not found and leaves open the possibility that the page will return).
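A rough sketch of both steps in Apache .htaccess (the /bogus-section/ path is hypothetical; it just stands in for wherever the unwanted URLs live):

```apache
RewriteEngine On

# Tag requests for the unwanted URLs so a header can be attached to them.
SetEnvIf Request_URI "^/bogus-section/" BOGUS_URL

# Fire the noindex directive from the HTTP header rather than an HTML meta tag.
Header always set X-Robots-Tag "noindex" env=BOGUS_URL

# Serve 410 Gone (stronger than 404) for the same URLs.
RewriteRule ^bogus-section/ - [G]
```

Worth checking with curl -I afterwards that both the 410 status and the X-Robots-Tag header actually come back on those URLs.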
| effectdigital0 -
Structured data for different-sized packages
Hey! No problem - just trying to figure out the best way to do this too! Thanks for the detailed reply. All valid points regarding indexing thin content and showing customers more than one size, but those can be solved. Let's look at this with an actual example.
Redbubble.com (an Alexa top 1000 website in the US) is selling a throw pillow in different sizes and different types, and the cost differs based on the size and type chosen. This is their main product page for this product: https://www.redbubble.com/people/straungewunder/works/25221192-familiar-sooty-owl?p=throw-pillow
On this main product page they send the customer to a default size (16x16) and type (cover only). But as it is a dropdown, the customer is not stuck with just one size - he or she can choose any of them from the dropdown. And on this same page, they have this schema markup. .....
Then they have duplicate pages for all the other pricing options. E.g. for size 26x26 and type 'cover only', this is the URL: https://www.redbubble.com/people/straungewunder/works/25221192-familiar-sooty-owl?p=throw-pillow&size=26x26&type=cover-only and the schema markup is identical to the one listed above, except for the price.
All these pages are exactly the same except for the default size and type chosen, and therefore the price is different on each page. Duplicate pages are not a problem, as they use canonical tags properly: all the pages carry a canonical tag, and it always points to the original page.
Regarding indexing the pages - only the original page is indexed. If you go to Google and search for their main product URL, it comes up on Google. If you go to Google and search for the other product pages with different pricing options, they are not indexed. So Google isn't wasting crawl budget on these duplicate pages. But in your case you would index more pages if the search volume is high for different quantities (and then also change the H1/title/meta tags respectively for those indexed pages).
Also, I've written this up as a blog post, as I think more people have this problem and will find it useful. Apologies if you have already considered this, but let me know if this still doesn't work for you. Interested to know what you finally go with!
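For anyone following along, the pattern being described - variant pages that canonicalise to the main product URL while each carrying Offer markup that differs only in price - might look roughly like this generic sketch (the URL, name and price are placeholders, not Redbubble's actual markup, which isn't reproduced in the thread):

```html
<!-- On a variant page such as ...?p=throw-pillow&size=26x26&type=cover-only -->
<link rel="canonical" href="https://www.example.com/works/familiar-sooty-owl?p=throw-pillow" />

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Familiar Sooty Owl Throw Pillow",
  "offers": {
    "@type": "Offer",
    "price": "34.50",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```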
| PaperTrail1 -
Do bulk 301 redirects hurt SEO value?
No, doing this in bulk shouldn't necessarily hurt them, as long as you make sure that you're doing it the right way with the proper redirects in place. In the end, you see tons of sites that go out of business or change domain names and move their content over to a new domain. That's why your bulk action isn't such a big problem.
| Martijn_Scheijbeler0 -
Rebranded Website Uses a Forward Slash / at End of URLs - Is This Considered a Redirect?
301 redirects will almost assuredly be utilised to keep this maneuver SEO-friendly. But wait! 301 redirects can fail to carry 'most' of the SEO authority from one page to another in two key situations.
The first: if the content is too dissimilar on the destination URL, 301s can fail to port authority across. For you, this won't be a big issue, as (from the sounds of it) the pages will be almost identical, byte for byte. The new pages may be very, very slightly larger due to having source code that contains more instances of the character "/", but that's not something which would faze Google at all.
The other situation where 301s can fail to move all the SEO authority across is when redirect chains occur. But you're just 301-ing "non trailing /" URLs to "trailing /" URLs, so it shouldn't be a problem, right? Hmm, there are ways you could come unstuck here.
Let's imagine we have a hypothetical retail site called "buymyproducts.com". Let's imagine that a few years ago, the site used to be on HTTP (insecure) and has moved over to HTTPS (encrypted). All pages were affected by an HTTPS-injecting redirect; to create an example: http://buymyproducts.com/product-category/product was 301 redirected to https://buymyproducts.com/product-category/product (with HTTPS). That redirect rule now sits within the web.config or .htaccess file and waits for insecure requests, redirecting as appropriate.
Now we want a new redirect rule, and it will affect the page like this: https://buymyproducts.com/product-category/product will be 301 redirected to https://buymyproducts.com/product-category/product/ (with a trailing slash). That seems fine, but when the oldest architecture is queried, you'll end up with redirect chaining like this:
A) http://buymyproducts.com/product-category/product will be redirected to
B) https://buymyproducts.com/product-category/product (with HTTPS), which will then be redirected to
C) https://buymyproducts.com/product-category/product/ (with HTTPS and a trailing slash)
... so as you can see, your redirects will begin to chain unless you foresee that problem up-front and write 'more complex' redirect rules that just connect A to C whilst entirely skipping B (there's a sketch of this below).
If the site existed on the oldest architecture (no trailing slash, insecure HTTP) for the longest time (say 7 out of 10 years), then it's likely that many of the best links will still be hitting the very oldest architecture in terms of link destinations. Those backlinks won't translate into SEO authority for your site (very well) if your redirects begin to chain up.
To stop yourself from losing large chunks of legacy authority, you'd have to do the redirects really well and ensure that your developer's rules never begin to chain. If they are confident that they can avoid this chaining by writing much more complex redirect rules, then go for it. If not, hold off.
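A sketch of the 'A straight to C' idea, assuming Apache .htaccess on the hypothetical buymyproducts.com (a real site would need extra handling for the homepage, real files such as images/CSS, and any other legacy quirks):

```apache
RewriteEngine On

# Apply when the request is still insecure OR is missing its trailing slash...
RewriteCond %{HTTPS} !=on [OR]
RewriteCond %{REQUEST_URI} !/$
# ...but skip URLs that look like real files (e.g. .jpg, .css), which should
# not gain a trailing slash.
RewriteCond %{REQUEST_URI} !\.[a-zA-Z0-9]{2,4}$
# One 301 straight to the final "https + trailing slash" URL - no intermediate hop.
RewriteRule ^(.+?)/?$ https://buymyproducts.com/$1/ [R=301,L]
```

The point is that the old HTTPS-injecting rule and the new trailing-slash rule are collapsed into a single redirect, so legacy links to the oldest URLs only ever pass through one hop.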
| effectdigital0 -
Is there a limit on backlinking per week?
I agree with Martijn. The idea behind what he's explaining is the pace and acceleration at which you acquire backlinks. Remember that Google has the most advanced algorithms for detecting backlink schemes and tactics; they'd eventually work out that your site is creating links against their guidelines. Hope it helps. Best of luck. GR
| GastonRiera0