Category: Intermediate & Advanced SEO
Looking to level up your SEO techniques? Chat through more advanced approaches.
-
Backlink "class=X-hidden-focus"
Sadly I don't know the answer to your question, and I'm very curious to see if anyone has a concrete one. I Googled around a bit trying to learn but couldn't find any really solid information. My intuition is that they are not treated as nofollow but are actually not considered links at all. That's just a guess based on the name; I have no experience with them whatsoever. Or maybe it's something a black hat does to hide links? That would be even more obvious, though... Interesting!
| bendroid0 -
Site Migration Question - Do I Need to Preserve Links in Main Menu to Preserve Traffic, or Can I Simply Link on Each Page?
Hi OrioleOriole and effectdigital, apologies for the delay, but thank you for your help, we preserved the links in the end. I've marked your responses as good answers and liked them, thanks again
| ruislip180 -
If I use links in a div tag instead of an "a href" tag, can Google read the links inside the div tag?
Thanks a lot Alick300 for your kind response and reference.
| pujan.bikroy0 -
New blog on a separate server to the main website?
IMO they are creating a problem for no reason. I can't think of any reason a separate server would be worth the headache. Just put it in its own folder or subdomain if you really want to separate them.
| Jason-Rogers0 -
AU and US site needs Hreflang?
You will want to use hreflang, either in the pages or in the sitemaps of the two sites. I'm assuming that you want the pages from the first site to rank for searches in Australia and the second site's pages to rank in the US, and hreflang tags are how you communicate that desire to the search engines. The AU site should have a self-referencing hreflang tag for "en-AU" and another hreflang tag for "en-US" pointing to the equivalent page on the sister site; the US site should mirror that, with a self-referencing "en-US" tag and an "en-AU" tag pointing to the equivalent page on the sister site. You may also want to include an x-default tag on both. You also probably want to geographically target the two sites (or at least one of them) in the Search Console settings. Lastly, in case a visitor lands on the wrong site (the search engines don't always obey our directives), you will want some way to inform the visitor that there is a "more appropriate" site. This can be a modal dialog based on geo-IP, or it can be just part of the site design, or whatever makes sense for your business.
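The tag setup described above can be sketched as head markup on each page; the domains and the /widgets/ path here are hypothetical placeholders:

```html
<!-- On the AU page (https://example.com.au/widgets/) -->
<link rel="alternate" hreflang="en-au" href="https://example.com.au/widgets/" />
<link rel="alternate" hreflang="en-us" href="https://example.com/widgets/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/widgets/" />

<!-- On the US page (https://example.com/widgets/) - same set, self-reference first -->
<link rel="alternate" hreflang="en-us" href="https://example.com/widgets/" />
<link rel="alternate" hreflang="en-au" href="https://example.com.au/widgets/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/widgets/" />
```

Note that the annotations must be reciprocal: each page lists itself and its sister page, or the search engines may ignore the tags.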
| seoelevated0 -
Homepage with and without language subfolder
It is the "normal" way to do it. Oftentimes pages detect the region or browser language and redirect, but that's not the question here. Yes, you may need to take care of some links you have control over (Yelp, LinkedIn, socials, whatever). But I bet you link to all languages anyway: your homepage, for example, links to the other language versions of the homepage. So the "juice" you thought was going to one language is being sent to the other languages in smaller portions, which is how it should be. No problemo.
| paints-n-design0 -
Old content with trailing `/` - What should be my new approach?
Thank you for your response @Optimal_Strategies. Of course, the right way to get rid of all the trailing slashes throughout the site is to enforce the rule in the NGINX configuration. However, the real question is which strategy is right: 1. Leave the existing content on the site with a trailing "/" and build the rest of the site with URLs without trailing slashes, OR 2. Enforce non-trailing slashes site-wide and 301-redirect all the old URLs to the new ones.
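If you go with the site-wide option, the NGINX rule is small. A minimal sketch, assuming a single server block and a hypothetical domain (the regex deliberately does not match the bare root "/", which must keep its slash):

```nginx
server {
    listen 80;
    server_name example.com;  # hypothetical domain

    # 301 any URL ending in "/" to the same URL without it,
    # e.g. /blog/post/ -> /blog/post (root "/" is unaffected).
    rewrite ^/(.*)/$ /$1 permanent;

    location / {
        try_files $uri $uri/ =404;
    }
}
```

With this in place, the old trailing-slash URLs keep working as 301s, so existing links and rankings should carry over to the non-slash versions.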
| KaustubhKatdare0 -
Author Credit when Using Existing Article
In my opinion it's better to ask the publisher to create unique content related to the topic. Yes, you can use the same article and apply canonical tags, but if you want both parties to benefit in terms of link juice, try to insist on unique content.
| invechseo0 -
Pointing additional domains at your main one
Furthermore, piggybacking on Colemckeon and Kevin Budzynski: if there are any topical, authoritative links, reach out to the webmaster by email or phone and ask them to change the URL to yours. More than likely they will comply. This may not be the issue, but you never know. -Jonathan
| JonAlonsoCNC1 -
Whats up with the last google update.
This has been a theme since Aug 1st. People are yo-yoing around. I watched a site go from the 8th page to the 2nd, back to the 8th, and finally crawl its way back to the 2nd. I would wait a week or so and let everything cool down. I have been extremely perplexed by the recent algo updates and am hoping for some predictability in the future. The trend I do see is that sites with a lot of backlinks from poor-quality sites (such as "SEO by Company X") are getting hit hard, and I'm seeing a lot fewer cases of backlinks from high-authority sites ranking sites very well.
| Colemckeon0 -
Should I submit an additional sitemap to speed up indexing
I personally would not. There is no need to submit an additional sitemap when you can just ping Google to crawl your site at any time. Google will put you in a queue and crawl you almost immediately. I would not recommend abusing this; only ask Google to index your website when you have new content or when Google is taking too long to index current content. The more fresh content you provide, the more Google will revisit your page and continue to index you. Hope this helps!
| Colemckeon0 -
Getting Google to index our sitemap
Now I can see the sitemaps. Loading takes time, a lot of it, and they look weird, but that may be OK. There is stuff in them, though, which I would not like to have in Google's index. Nonetheless, what's the message in GSC? I opened them in Chrome, Firefox, and on my Pixel as well; the first one looks good, and the linked ones that all had the error are now different from each other (with line breaks or without, with spaces or without), but they do at least contain links. Is the site on a subdomain, with pages on a different domain? (I didn't see that.) That makes it much trickier...
| paints-n-design0 -
Indexing folders in google
A 301 redirect is the best way to handle those subdomains.
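As a minimal sketch of that 301, here is one way to do it in an NGINX server block (the server names are hypothetical placeholders; swap in your own):

```nginx
# Catch requests to the old subdomain and 301 them to the
# equivalent path on the main domain, preserving the URL.
server {
    listen 80;
    server_name old.example.com;
    return 301 https://www.example.com$request_uri;
}
```

Because `$request_uri` is appended, each old URL redirects to its matching page rather than dumping every visitor on the homepage.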
| Rajesh.Prajapati0 -
Should I exclude my knowledge center subdomain from indexing?
Well, the advantage of having a subdomain is that you can target a specific audience with it, since Google will treat it as its own unique site. The biggest disadvantage is that if you don't do it right, you will not get the expected results and could even draw traffic away from your main domain.
| jasongmcmahon2 -
After a hack and remediation, thousands of URLs still appear as 'Valid' in Google Search Console. How to remedy?
Google Search Console actually has a URL removal tool built into it; unfortunately it's not really scalable (mostly one-at-a-time submissions), and on top of that the effect of using the tool is only temporary (the URLs come back again). In your case, I reckon changing the status code of the 'gone' URLs from 404 ("not found, but might return") to 410 ("gone!") might be a good idea. Google may digest that better, as it's a harder de-indexation directive and a very strong crawl directive ("go away, don't come back!"). You could also serve the meta noindex directive on those URLs. Obviously you're unlikely to have access to the HTML of non-existent pages, but did you know noindex can also be fired through X-Robots, in the HTTP header? So it's not impossible: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/404 (Ctrl+F for "X-Robots-Tag HTTP header"). Another option is this form to let Google know outdated content is gone, has been removed, and isn't coming back: https://www.google.com/webmasters/tools/removals ... but again, one URL at a time is going to be mega-slow. It does work pretty well, though (at least in my experience). In any eventuality, I think you're looking at a week or two for Google to start noticing in a way that you can see visually, and then maybe a month or two until it rights itself (caveat: it's different for all sites and URLs, it's variable).
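The 410-plus-header combination above can be sketched in NGINX; the `/cheap-pills/` path is a hypothetical stand-in for whatever URL pattern the hack created:

```nginx
# For the hacked URL pattern: serve 410 Gone plus a noindex
# header, so crawlers get both the status and robots directives.
location ~ ^/cheap-pills/ {
    add_header X-Robots-Tag "noindex" always;
    return 410;
}
```

The `always` flag matters here: without it, NGINX only adds the header on 2xx/3xx responses, so the noindex would never reach the 410 pages.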
| effectdigital0 -
Can images with a company logo get included on featured snippets?
This is also relevant to knowledge-graph boxes and the images which Google compiles into those. It's not quite the same thing as featured snippets, but still pretty neat. Typically it's a good one for industrial materials or chemical compounds, e.g.: https://www.google.com/search?q=Poly%28methyl+2-methylpropenoate%29 https://d.pr/i/lQLl1v.png (screenshot) or https://www.google.com/search?q=polyethylene https://d.pr/i/HwWXoc.png (screenshot) ... there are lots of 'material'-based knowledge-graph entries, which pull in images from other sites in order to build up a good view of what the material is. Some people actually find the images from their sites which Google injects, and edit them to use a small and unobtrusive watermark (the trick is not to get too greedy, or Google notices and replaces the image from your site with an image from another site!). Obviously, where branded products and compounds/materials converge, it's easier to get some branding showing in the tiny little images: https://www.google.com/search?q=cbd+oil https://d.pr/i/jLn5uQ.png (screenshot). A lot of these actually come through Google's image search results. You don't see so many successful injections in this particular area these days, though. This one is quite a neat example: https://d.pr/i/582BzT.png (screenshot), and this one also: https://d.pr/i/kaibeC.png (screenshot) ... a bit of free advertising for Birds Eye and Green Giant there.
| effectdigital0