Category: Intermediate & Advanced SEO
Looking to level up your SEO techniques? Chat through more advanced approaches.
-
Google Indexing Stopped
Hi Jeffrey, It's hard to draw conclusions without knowing the website. Is the site actually 2.3 million pages, or did Google deindex a bunch of duplicate or thin content? I suspect that could be what is happening. Are you seeing a decrease in organic traffic and/or rankings since this happened, or were these thin-content pages that were not generating any traffic? An average load time of 4 seconds is likely not the issue. It's not ideal for UX (particularly on mobile), but I have seen Google fully index websites with higher average page load times than this. Were you getting these index numbers from Google Search Console or from performing a site search ("site:")? If you want to DM me the site, I can take a closer look for you. Joe
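For what it's worth, a quick sketch of the site search mentioned above, with example.com standing in for the actual domain. Note that the counts Google returns for site: queries are only rough estimates, while Search Console's index coverage report is the more reliable figure:

    site:example.com
    site:example.com/blog/

Narrowing the query to individual sections like this can show whether a specific part of the site has dropped out of the index.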
| Joe_Stoffel0 -
Internal Clicks and CTR. Is REL=canonical better than Noindex in this case?
Thanks James, this has been so helpful. I've searched all manner of search terms but hadn't come across "dwell time" before. It's amazing how one search term can open up the internet for you. I learnt another relevant search term in this article - "Unicorns" are apparently high-performing internal pages. http://www.wordstream.com/blog/ws/2017/01/25/dwell-time-seo. Thanks again
| Andrew-SEO0 -
Interested ranking results
Thank you for your reply, but my pages are at the same level as all the others and, like all the others, don't have any links... The keywords are not less competitive either, so the answer lies somewhere else... Do you have any other ideas?
| seoanalytics0 -
Help.. there was a html and php version of my home page on my server for about a week. Now lost all rankings!
Hello again. I have just discovered this URL on a duplicate content checker site... http://www.aprilnites.com.au/?el.outerHeight({margin:true})/2:el.outerWidth({margin:true})/2) I have no idea where it has come from or how to get rid of it, but it brings up my home page. Can anyone advise on this?
| GemmaApril0 -
What should I do if same content ranked twice or more on Google?
I would say the first thing to do is send a thank-you letter to Google: the more times you show up, the more traffic you will get. If you have two pages with identical content showing, Google will probably drop one in due course. You could consider restructuring one of the pages to go into the topic in more depth, but you must keep it within context to avoid losing the ranking.
| seoman100 -
Finding non-linked brand mentions
You need one of the robust social media monitoring tools that support advanced boolean search queries. Many tools only allow three boxes (include any of these terms, must include these terms, exclude these terms), and there is no way to specify proximity or use context. The only way to really trim the results down to what is relevant is boolean logic. For example, if you were monitoring for amazon, you could exclude:

    (amazon AND (jungle OR brazil OR rainforest OR "rain forest"))

Or use proximity:

    (amazon NEAR/5 (shopping OR products OR ecommerce OR e-commerce))

Additionally, you can look for user intent with phrases like:

    ("shopping on" OR "buy from" OR "need to shop" OR "research products on") NEAR/3 amazon

Obviously these examples are just small samples of what can be done. Some queries for common words may end up being 2,000 characters long to catch all use cases. But without the ability to create custom boolean queries that include nested parentheses, quotes for phrases, and proximity operators like NEAR, the other tools won't help. In a previous job, I wrote a white paper explaining why boolean is important for social media monitoring and provided some tips on what to look for in a social media monitoring tool: https://www.dragonsearch.com/files/social-media-monitoring-tools-boolean-search-query.pdf Good luck! Jannette
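Putting those pieces together, a fuller query might look something like the sketch below. This is purely illustrative, reusing the same amazon example; the exact NOT/exclusion syntax varies from tool to tool, so check your vendor's documentation:

    (amazon NEAR/5 (shopping OR products OR ecommerce OR "e-commerce")) NOT (jungle OR brazil OR rainforest OR "rain forest")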
| JannetteP0 -
Homepage keeps disappearing and reappearing for important keywords every few days
If your site has a penalty, fixing it will require determining why the penalty was given. Most likely it has something to do with breaking the webmaster guidelines. That is best assessed by the webmaster himself, and if the webmaster can't do it, then an expert at identifying Google penalties is the best bet. People who are not experts at penalty identification will be guessing, and you don't want to guess, do the recommended work, wait to see if the site recovers, see no recovery, guess again, work, wait, find nothing happens, and so on.
| EGOL1 -
Google disavow file
To the best of my knowledge, Google will read it pretty regularly (every day or every few days). However, Google takes several weeks until the disavow request is fully processed.
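For reference, the disavow file itself is just a plain text file uploaded through Search Console. A minimal sketch, with spammydomain.com and example.com as placeholder domains:

    # lines starting with a hash are comments
    # disavow every link from an entire domain
    domain:spammydomain.com
    # disavow a single specific URL
    http://www.example.com/paid-links-page.html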
| KevinBudzynski0 -
Restructuring Areas of a Website
Your internal link structure does help your rankings, but the way I see this, there's not going to be any change to the pages that are actually ranking, right? If you remove the page that didn't rank for the keywords anyway, it shouldn't impact the rankings of the other pages. There are a couple of factors to take into account when you do this:

- Don't forget to 301-redirect the removed URL to, for example, the level above (in this case /key/), so that direct/referral traffic to that page doesn't get 404'd (see the sketch after this list).
- Is there a lot of traffic on the /cupboards-lockers/ page now? If so, how does this traffic behave: does it flow towards the other pages before converting, or do visitors convert on this page? How is the UX (bounce rate, time on page, exit rate, etc.)? These factors can help you determine what to do with each page.
- Make sure there are no other dead ends on your site: check the sitemap and crawl your site with, for example, Screaming Frog to prevent any 404s from showing up.
- Make sure there are no products assigned to that page specifically, which would create an odd URL structure at product level or a breadcrumb with missing or dead links.

Hope this helps!
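A minimal sketch of that 301 redirect, assuming an Apache server and reusing the /cupboards-lockers/ and /key/ paths from the example above:

    # .htaccess: permanently redirect the removed category page to its parent
    Redirect 301 /cupboards-lockers/ /key/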
| Justen_H0 -
Category Page as Shopping Aggregator Page
Hi Ramon van den Ende, Thank you for your reply. Good to see that someone else had the same thoughts. I agree this certainly is gaming the system, but isn't that what we as SEOs have been doing since the start? I do think that in practice it would work, but again, to your point, how long would it last before big G cottons on and decides to stop showing the results? This is exactly the kind of discussion I was after. Alex
| Alexcox60 -
Http to https - Have we done it correctly?
The big problem is your redirection. At the moment, you DO NOT redirect people to the https website. Read more about the changes you have to make here -> https://moz.com/learn/seo/redirection. Basically, if you run on Apache, you need to modify your htaccess file so that everyone who lands on the non-SSL version is redirected to the https one. A quick Google search will give you examples of rules to include in your file, for example: https://uk.godaddy.com/help/redirect-http-to-https-automatically-8828 (there is also a sketch below). In terms of the questions you asked:

- You should modify the settings of the website and set https as the preferred version.
- You shouldn't have two different sitemaps. The non-SSL one should not even work (it should be redirected as mentioned above).
- Of course your robots.txt should include the https links. Again, the version without them should already be redirected.

Hope this helps.
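A minimal sketch of the kind of htaccess rules meant above, assuming Apache with mod_rewrite enabled (test on a staging copy before deploying):

    # .htaccess: permanently redirect all HTTP traffic to the HTTPS version
    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]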
| iugac0 -
Potential issue: Page design might look like keyword stuffing to a web crawler
Ah, a very interesting question! I'd not be too concerned; you're loading the content in through a data attribute rather than directly as text. However, there are definitely a few options you could consider:

1. Render via SVG. Feels like the safest bet, though that's going to be a pretty large, complex set of vectors.
2. Save + serve as an image (and overcome the file size concerns by using WebP, HTTP/2, a CDN like Cloudflare, etc.).
3. Serve the content via a dedicated JavaScript file, which you could block access to via robots.txt (a bit fudgey!).

I'd be keen to explore #2 - feels like you should be able to achieve the effect you're after with an image which isn't ridiculously huge.
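If option #3 were chosen, the robots.txt rule might look something like this. A sketch only, with /js/decorative-content.js as a hypothetical path for the dedicated file:

    # robots.txt: keep crawlers away from the script that injects the text
    User-agent: *
    Disallow: /js/decorative-content.js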
| JonoAlderson0 -
25% of expired domains came with a Google manual penalty
The penalty mentioned in the article wasn't even a link-based penalty... The Moz Spam Score looks at the quality of links pointing to a domain and has nothing to do with spam content on pages, so that score is irrelevant here. Having the penalty removed could have been as easy as clicking the 'Request a review' button so Google would check that the domain no longer contains any auto-generated spam. I would also have to ask why the person even wanted these domains. They are clearly up to some black-hat SEO tactics and should probably stop complaining about Google penalties...
| davebuts0 -
SERP Ranking
Hi James, Thanks for the reply. I have checked the speed, and clearly the speed could be better, but in the UK it does seem to be thereabouts within the expected range. Also, the 404s? Those two pages are working perfectly for me. Would you suggest anything else?
| mg33dev0 -
Expired Domains for Back Links - any good?
Hi. But do you not effectively get the links from the expired domain if you redirect it?
| Heinwest0