Category: Intermediate & Advanced SEO
Looking to level up your SEO techniques? Chat through more advanced approaches.
-
Robots.txt Disallowed Pages and Still Indexed
And don't forget to remove the disallow rule in robots.txt first if you want the page removed from the index. If you add a meta noindex while the page is still disallowed, it won't go anywhere: the crawler will never fetch the page, so it will never see the noindex tag, and the page will stay indexed. The order is: Allow > add meta noindex > wait for it to be deindexed > Disallow.
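The ordering matters because a disallowed URL is never fetched at all, which can be sketched with Python's standard-library robots.txt parser (the example.com URLs and the /private/ path are hypothetical):

```python
from urllib import robotparser

# Parse an example robots.txt in memory and check whether a crawler may
# fetch each page. While a page is disallowed, the bot never fetches it,
# so it never sees a meta noindex tag on that page.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

print(rp.can_fetch("*", "https://example.com/private/page.html"))  # False
print(rp.can_fetch("*", "https://example.com/public/page.html"))   # True
```

So a noindex tag on /private/page.html would go unseen until the Disallow rule is lifted.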
| Igor.Go0 -
Google Adsbot crawling order confirmation pages?
Hi Sam, I can see how this might be concerning. Without knowing your site I can't confirm anything, but here are answers to your questions: 1. Bots have been known to "fill out forms" before, and Googlebot at least has been known to discover pages through the use of Chrome (a user browsing in Chrome). There are many ways a URL can be discovered, but if you are sure there is no link to it anywhere, I wouldn't worry about it. 2. No. That is what header codes are there for: to let the bots know what is there, what is forbidden, etc. 3. Other than robots.txt, there isn't any way to stop them from sending requests. If it gets out of hand, you can try talking to AdWords directly, but more than likely this is not causing an issue. Overall, I'd just let it happen. Let them get the 403 error and they'll figure it out. As long as this isn't showing in the organic index, you should be fine.
| katemorris0 -
Site map creator
Hi, Go with ScreamingFrog if you're not on WordPress. If you are running WordPress, go with an SEO plugin like the one from Yoast; that will make sure new pages automatically get added to the sitemap instead of you having to recrawl your site frequently. A sitemap can contain up to 50,000 URLs, so for now, with only 1,300 URLs, I wouldn't worry about the size of the sitemap or about needing multiple sitemaps.
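For reference, a sitemap file is just an XML list of URLs, and a single file can hold up to 50,000 of them. A minimal sketch (URLs are hypothetical) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2017-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/products/widget</loc>
  </url>
</urlset>
```

A plugin like Yoast generates and updates this file for you; ScreamingFrog exports one from a crawl.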
| Martijn_Scheijbeler0 -
Local business with two separate websites...what to do?
Hi Ricky! If the client is determined to keep both sites, here is the only way to be 99% safe doing so:
- Only 1 site can list NAP (name, address, phone). The other must not list NAP; you do not want any part of NAP shared between the 2 sites.
- Do not build citations for the NAP-less site. Remove them where they exist.
- Zero shared content between the 2 sites.
- Define a totally different purpose for the existence of the second site. If site A is about the business, site B could be about something else, like a community information portal about health care.
- Do not interlink between the 2 sites.
- If it's necessary to have a phone number of some kind on site B, it must be unique and must be answered with a different brand name than that for site A.
The above should remove concerns about loss of authority, citation inconsistency, and duplicate content. It's quite a bit of trouble, though, and may not be worth it unless the client can think of a really good, different purpose for site B other than promoting his business on it.
| MiriamEllis0 -
SEO: How to change page content + shift its original content to other page at the same time?
If the content of the two pages is related, you might also use an internal link (preferably within the body content of the page, e.g. "for more information on KEYWORD"). That will help tell Google you consider ABC the more important/relevant page. I'd still try to find a closely related keyword for the lesser page if you can; the less keyword cannibalization, the better. Liz
| LizMicik0 -
Ranking 1st for a keyword - but when 's' is added to the end we are ranking on the second page
Hi Brett, I've had the same situation with a client: we focused on the singular and ranked higher for the plural, even though the optimization was done for the singular keyword. The reasons we found: CTR in the SERPs; search volume (lower in this case); several links using the plural keyword. Hope it helps.
| GastonRiera0 -
Site-wide Canonical Rewrite Rule for Multiple Currency URL Parameters?
Added as a note - you can also use GSC (Google Search Console) to tell Google which URL parameters should be ignored when indexing. That can be a quick shortcut initially, but you'll definitely want to get rel=canonical properly implemented for Google as well as for all the other search engines.
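As a sketch of what the rewrite rule needs to produce (the URL and the `currency` parameter name are assumptions, not from the original question), the canonical URL is simply the page URL with the currency parameter stripped:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def canonical_url(url, ignored_params=("currency",)):
    """Strip ignored query parameters (e.g. a currency switch) so every
    parameter variant points at one canonical URL."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in ignored_params]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

print(canonical_url("https://example.com/product?currency=EUR&size=m"))
# https://example.com/product?size=m
```

The resulting URL is what goes into the `<link rel="canonical">` tag on every currency variant of the page.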
| ThompsonPaul0 -
6 months Later - 0 Domain Authority/Page Authority and losing Rankings
You have 1,300 products and about 1,500 pages after six months. How is your site's traffic? How is your content? How are your backlinks? Did you submit your site to Google? Did you do a data migration or change your SEO URLs? If all of that is in order, DA and PA should not stay at zero.
| Nayotanguyen0 -
Going from 302 redirect to 301 redirect weeks after changing URL structure
I would proceed with changing the 302 redirects to 301 redirects. A 302 is a temporary redirect that passes 0% of link juice (ranking power). If you change to 301s you'll start passing link juice again, and may see rankings recover in time. Here's some more info on redirects as they relate to SEO: https://moz.com/learn/seo/redirection
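As a sketch, assuming an Apache server (the paths are hypothetical), the switch from temporary to permanent is just a change of status code in the redirect directive:

```apache
# Before (temporary - passes no ranking power):
# Redirect 302 /old-page https://www.example.com/new-page

# After (permanent - passes link equity):
Redirect 301 /old-page https://www.example.com/new-page
```

On nginx the equivalent would be a `return 301` directive inside a `location` block.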
| vcj0 -
23k Links from one doman pointing to a single page, good or bad?
Google has devalued sitewide links (and links from sidebars/footers) for a long time. Where you could get into trouble with this is if those "text ad" links don't include the requisite nofollow attribute. That would definitely get them flagged as manipulative and against the ToS. Paul
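A sketch of what a compliant text-ad link looks like (the URL and anchor text are hypothetical):

```html
<!-- rel="nofollow" tells search engines not to pass ranking signals
     through this paid/sitewide link. -->
<a href="https://advertiser.example.com/" rel="nofollow">Advertiser name</a>
```

Without that attribute, 23k identical paid links from one domain look exactly like a link scheme.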
| ThompsonPaul0 -
Mobile Canonical Tag Issue
Not the parameter, specifically speaking. You need the canonical on the mobile URL to exactly match the primary URL of the non-mobile page, so remove the /mobile/ directory from the URL. (Technically, a parameter is something added to the end of a URL after a "?" - e.g. /product/product-code?sort=desc - which you didn't show in your examples. Canonical URLs should never include such parameters; in fact, one of the main reasons for using canonicals is to fix issues with extra unwanted parameters being indexed as separate pages. Didn't want to risk confusion here.) Paul
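To illustrate with hypothetical URLs, the tag in the head of the mobile page should point at the clean, non-mobile URL:

```html
<!-- On the mobile page https://example.com/mobile/product/product-code -->
<link rel="canonical" href="https://example.com/product/product-code">
```

Note the canonical target carries no /mobile/ directory and no query parameters.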
| ThompsonPaul0 -
Splitting One Site Into Two Sites Best Practices Needed
Hi David, Wow, that is better than I would have imagined. If I may ask, what was the PA/DA of these pages and the approximate average Moz difficulty of the keywords? This site is around DA 45 / PA 20 and usually does okay with keywords under 40 difficulty. Best... Mike
| 945010 -
Setting A Custom User Agent in Screaming Frog
Setting a custom user agent determines things like whether HTTP/2 is used, so there can be a big difference if you change it to something that doesn't take advantage of HTTP/2. Apparently HTTP/2 support is coming to Pingdom very soon, just like it is to Googlebot: http://royal.pingdom.com/2015/06/11/http2-new-protocol/ This is an excellent example of a user agent's ability to modify the way your site is crawled, as well as how efficient the crawl is: https://www.keycdn.com/blog/https-performance-overhead/ "It is important to note that we didn't use Pingdom in any of our tests because they use Chrome 39, which doesn't support the new HTTP/2 protocol. HTTP/2 in Chrome isn't supported until Chrome 43. You can tell this by looking at the User-Agent in the request headers of your test results." Note: WebPageTest uses Chrome 47, which does support HTTP/2. Hope that clears things up, Tom
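Under the hood, setting a custom user agent just means sending a different User-Agent request header, which servers and CDNs can use to vary content or protocol. A minimal sketch with Python's standard library (the UA string is made up):

```python
from urllib.request import Request

# Build a request carrying a custom User-Agent header. A server may serve
# different content, or negotiate a different protocol, depending on this
# string - which is why changing the user agent in a crawler like
# Screaming Frog can change what gets crawled and how.
req = Request(
    "https://example.com/",
    headers={"User-Agent": "Custom Crawler/1.0"},  # hypothetical UA string
)
print(req.get_header("User-agent"))  # Custom Crawler/1.0
```

Crawling once as a desktop browser UA and once as a bot UA is a quick way to spot this kind of variation.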
| BlueprintMarketing0 -
How long for a Disavow file to take effect for a non-manual penalty?
As Stephen mentioned, there's no exact time frame for how quickly the changes will be seen. We at GetBackonGoogle.com submit many disavow lists for many different quality sites. Every site responds differently, and it's important to continue building new, high-quality links. Show the search engines you're making the proper changes and playing by their rules. Did you try having the site remove the link before adding it to a disavow list? Often, when you see the rankings jumping around, that's a good sign: search engines are moving your listing around to see where it fits properly. This is a great time to monitor bounce rate and time on page. Make sure users are staying on the page and clicking through to additional pages; search engines will notice these things. Nick Adficient.com
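For reference, the file uploaded to Google's Disavow Links tool is plain text: one domain or URL per line, with # for comments (the entries below are hypothetical):

```text
# Links we asked the webmaster to remove on 2017-03-01; no response.
domain:spammy-directory.example
https://blog.example.net/paid-links-post.html
```

The `domain:` prefix disavows every link from that domain, while a bare URL disavows only that single page.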
| Chris_Hickman0 -
Google Search Console > Security Issues
Hi Sanjay, Appreciate you searching the community for similar questions. Do you have comments allowed on that page? Curious if it could have picked up any link spam from that. Agree with Brian to scan your website and reach out to your server provider to see if they have any more details on a potential hack. Do you happen to also have Bing Webmaster Tools? It would be interesting to confirm if they also picked up any sort of hack. Hope this helps!
| BritneyMuller0 -
YOAST Premium extensions vs Yoast SEO Free?
Yes, they're useful plugins (they cost a lot), but I think they are well made and, unlike some other plugins, they're updated very often. However, I wouldn't really recommend their Video plugin unless you have a very particular need. Their Local and WooCommerce plugins are excellent at what they do, and the News plugin is significant. Buying the bundle may save you money, and I would recommend it if you have looked at everything and decided that's what you need. Sorry about the delay - let me know if that helps, Tom
| BlueprintMarketing0 -
Having 2 brands with the same content - will this work from an SEO perspective
Thanks David, will do!
| chill9861 -
Canonical Tags increased after putting the appropriate tag?
Hi Paul! It sounds like ThompsonPaul has answered your question. Are you all set, or can we be of more help?
| MattRoney0