Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hi, thanks for taking the time to answer, but honestly this doesn't address the point I'm trying to make. Let's see if I can be clearer with an example. Suppose I have an affiliate WordPress website that reviews cameras.

    OPTION 1: My URLs at the moment are "POST" type: /best-camera, /best-sony-camera, /best-fujifilm-camera, /fujifilm-xt20-review, /canon-60d-review, /nikkon-750d-review

    OPTION 2: Other websites in my niche are also WordPress sites, but they published the same topics as "PAGE" type: /best-camera, /best-camera/sony, /best-camera/fujifilm, /review/fujifilm-xt20, /review/canon-60d, /review/nikkon-750d. Here, the Sony and Fujifilm pages are WordPress child pages of the parent pages /best-camera and /review.

    Question: How much SEO advantage do these PAGE-type sites (option 2) have over my POST-type site (option 1)? Would it be worthwhile for me to redesign my website and reorganize everything from option 1 into option 2? When I started my website many years ago I had no idea what I was doing, so I just created everything as posts. But these people clearly are experienced and started off on the right foot, so I'm wondering whether transforming my site structure from option 1 into option 2 would be worth it.
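If you do decide to migrate from option 1 to option 2, each old URL would need to 301-redirect to its new counterpart so existing rankings and backlinks carry over. A minimal sketch for an Apache .htaccess file, assuming the example slugs above:

```apache
# Hypothetical one-to-one redirects from flat POST slugs to hierarchical PAGE slugs
Redirect 301 /best-sony-camera /best-camera/sony
Redirect 301 /best-fujifilm-camera /best-camera/fujifilm
Redirect 301 /fujifilm-xt20-review /review/fujifilm-xt20
Redirect 301 /canon-60d-review /review/canon-60d
Redirect 301 /nikkon-750d-review /review/nikkon-750d
```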

    | fabx
    1

  • Hi Shirapn, I think it is perfectly acceptable to use an accordion for 'read more' content. However, according to Rand Fishkin, who did a study on this a couple of years back, a page with hidden text will not rank as highly as it would with the text visible: Google does not give hidden text the same weight as visible text. https://moz.com/blog/google-css-javascript-hidden-text It is far better to have the text visible. One way around this would be to have a couple of lines visible at the top of the page and then add the rest of the text below the category content. I hope that helps - watch the video in the link above, it's very clear. Nigel

    | Nigel_Carr
    0

  • Hi, I read it and watched the video, but I don't see why I should add dofollow links. How does it help?

    | fabx
    0

  • Hi there, Crawl efficiency is only worth worrying about if you have something like 500k+ URLs; if you have fewer than that, please don't worry. Having 404s is completely fine, and Google will eventually lower its crawl frequency on those pages. Blocking them in robots.txt will stop Google from crawling them, but it will never remove them from the index. My advice here: don't block them in robots.txt. As Rajesh pointed out, you could turn those 404s into 410s to tell Google that they are gone forever. Yet Google has said that they treat 404s and 410s the same, and John Mueller said over a year ago that 4xx status codes don't cause crawl wastage. You can check it out in these Webmaster hangout notes from DeepCrawl. Hope it helps, Best luck. Gaston
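If you did want to return a 410 for URLs that are gone for good, a minimal sketch in Apache .htaccess (the paths here are hypothetical):

```apache
# Return "410 Gone" for permanently removed pages
Redirect gone /old-product-line
Redirect gone /discontinued-category
```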

    | GastonRiera
    1

  • Those URLs don't get indexed because they return 301 redirects. Google only indexes the redirect destination URL; this is to avoid people clicking results in the SERPs (Search Engine Results Pages) and then being redirected (slow) instead of heading straight to the web page. https://www.vulyplay.com/en-AU/swingsets (no hyphen) 301 redirects to https://www.vulyplay.com/en-AU/swing-sets (with a hyphen), and https://www.vulyplay.com/en-AU/trampolines (ends with "s", pluralised) 301 redirects to https://www.vulyplay.com/en-AU/trampoline (non-plural). You can see Google has indexed the redirect destination URLs here https://www.google.com.au/search?q=site%3Avulyplay.com%2Fen-AU%2Fswing-sets and here https://www.google.com.au/search?q=site%3Avulyplay.com/en-AU/trampoline

    | effectdigital
    0

  • It probably will. Linked or embedded resources (files, PDFs, images) should be accessed via HTTPS rather than HTTP. This remains true even if your HTTP links then redirect to HTTPS.
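For example, embedded resources should reference the HTTPS URL directly rather than relying on a redirect (hypothetical markup):

```html
<!-- Avoid: the browser first requests http://, then has to follow a redirect -->
<img src="http://example.com/images/photo.jpg" alt="Photo">

<!-- Prefer: the request goes straight to the secure URL -->
<img src="https://example.com/images/photo.jpg" alt="Photo">
```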

    | effectdigital
    0

  • Thanks for taking the time to answer. Agreed, it's confusing at best; it confused the heck out of me when I was deconstructing the behaviour. We generally get indexed faster than 2-3 days. Last time I checked, the average time to index was around 40 minutes. I guess that's because the engines know our content changes frequently. 1- "If the products on your site are selling within minutes, then why are you focusing your attention on how Google will index them?" Most of our purchasing customers come via natural search. 2- "As the products sell out within minutes and the redirection is stopped afterwards, then why would that affect how Google ranks your site?" I should have been clearer: the queue will trigger after a threshold is reached, not when the product is sold out. But if it's a particularly high-demand product, it could sell out before the threshold dips below that configured for the queue. Good suggestion about opening the queue in a tab; I will explore that option.

    | TSEOTEAM
    1

  • Image format: I believe the best-performing format is WebP, but I usually try to use PNG over JPG. Compressing images: I think what might work best in your case is a plugin like WP Smush. If you have a ton of images, I'd invest in a tool or plugin that dynamically compresses images as they are uploaded to your site. I like WP Smush because, along with compressing images, it also strips out the metadata associated with them. If you have a ton of images, it could be an attractive solution that you can scale. Outside of another plugin, you could try some sort of cloud-based solution to dynamically compress images before you upload them. I've tested an open-source image compression tool called Caesium in the past; it reduced some of my images by almost 40% and performed better than the plugins I was using, but I'm not sure it would be a scalable solution for you. Out of curiosity, how bad are your load times? Are you currently running into site speed problems, or are you trying to make incremental improvements?
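If you wanted to batch-compress outside of a plugin, one option is a short script. A minimal sketch in Python using the Pillow library (the directory paths and quality setting are hypothetical; you'd want to test quality levels on your own images):

```python
from pathlib import Path

from PIL import Image  # Pillow: pip install Pillow


def compress_to_webp(src_dir: str, out_dir: str, quality: int = 80) -> None:
    """Convert every PNG/JPEG in src_dir to a WebP file in out_dir."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    for path in Path(src_dir).iterdir():
        if path.suffix.lower() in {".png", ".jpg", ".jpeg"}:
            img = Image.open(path)
            # Re-saving as WebP also drops most embedded metadata,
            # similar to what WP Smush does
            img.save(out / (path.stem + ".webp"), "WEBP", quality=quality)
```
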

    | JordanLowry
    0

  • Yes, you should target backlinks from your own niche. For example, if you have a website where you share only tech articles and you want them to rank, you must be wary of spammy links that will increase your spam score. That's how I look after my own website, AmazeInvent: Internet Technology.

    | ThomeedisonSam
    0

  • Thanks for the answer! Last question: is /wp-admin/admin-ajax.php an important part that has to be crawled? I found this explanation: https://wordpress.stackexchange.com/questions/190993/why-use-admin-ajax-php-and-how-does-it-work/191073#191073 However, on this specific website there is no HTML at all when I check the source code, only a single line with a 0 on it.
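For reference, admin-ajax.php handles front-end AJAX requests in WordPress, which is why the common robots.txt pattern (the one WordPress itself generates by default) leaves it crawlable even while blocking the rest of /wp-admin/:

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
```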

    | Mat_C
    1

  • Hi there, When it comes to internationalization, there is a simple solution for Google Search: use hreflang tags. Here's what they say: Internationalization and multi-language websites - Google Search Help. Provided that you have a different URL for each store and that you've placed the hreflang tags correctly, you shouldn't worry that much. Of course, you still have to check rankings, traffic and the other metrics that are important to you. As for placing nofollow tags, this won't make any difference in this case. Keep in mind that nofollow attributes are there to tell Google not to follow (crawl) the links they are applied to. Here is more information from Google about the nofollow attribute. Hope it helps. Best luck. Gaston
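As a sketch, hreflang annotations go in the head of each version of the page, and every version lists all the alternates including itself (the URLs and locales here are hypothetical):

```html
<link rel="alternate" hreflang="en-us" href="https://example.com/us/" />
<link rel="alternate" hreflang="en-gb" href="https://example.com/uk/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```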

    | GastonRiera
    2

  • I have seen things like this happen before, but they're usually associated with a links penalty rather than a rich-snippet spam penalty. When Google removes the authority pipelines for bad links, they don't magically decide to start valuing those linking sites again because of a reconsideration request (so in that area, it's common for people to end up in an awful mess of unrealistic expectations). With rich-snippet spam penalties, I have seen some pretty savage ones, but usually they are more of an on/off scenario. Seeing the kind of continual decline which you describe is quite unusual.

    Technical factors can influence ranking results, but they tend to influence indexation more than rankings (e.g. making URLs which were previously hard to discover easier for Google to discover, so new ranking positions can be created). Technical changes are (usually, with exceptions) less good at pushing up existing rankings, which is more the domain of content, awesomeness and link-worthiness.

    "Redirecting duplicate versions, fixing redirects on internal links" - something that can be done with the best of intentions, yet which can often be done wrong. For example, maybe you own a site and you notice that both of these URLs are accessible (200 OK): https://www.mysite.com and https://www.mysite.com/ One has a trailing slash, the other does not. So you say to yourself, okay, we'll redirect one structure to the other! Seems logical, right? But what if one of your structures (non-trailing slash) was more commonly linked to than the other (forced trailing slash)? When you make your change, suddenly most of your most important backlinks are hitting 301 redirects instead of hitting your landing pages directly. In this hypothetical example, if you had picked the alternate structure (removing the trailing slash from URLs instead of forcing it), the site may have performed much better.

    This is just a hypothetical illustration, but it shows that simple ideas are never simple! In SEO we get paid for our analytical skills because they matter: people need analysis before making sweeping decisions, otherwise they don't realise the potential ramifications.

    "There's also work on-page running in the background fixing up keyword cannibalization, consolidating content, keyword mapping and ensuring the internal link structure is sound" - again, you may be shooting yourself in the foot in the short term. I am referring to what you term "consolidating content", which usually revolves around reducing the number of pages on your website and funnelling content together into fewer, more in-depth URLs which you hope will rank better. Totally the right thing to do in the long term, but in SEO, many strategies which yield long-term gains also cause disruption which causes short-term tail-off. If you JUST pulled yourself out of a penalty, was it really the right time to 'get disruptive'? I'd say no, it was not.

    If you are consolidating content, Google may or may not rank your single new page as well (for different keywords) as the two or more pages which were funnelled into its creation. Why? Well, from a technical POV, even when you deploy the mighty 301 redirect, it doesn't always transfer 100% of the SEO authority from the old URL(s) to the new URL. Google tends to run similarity checks over their last active cache of the old URL(s) against the new page you have supplied. If they seem % dissimilar, then that % of SEO authority is removed from the equity transfer of the 301 redirect. By similar, I mean something akin to taking all the content from both (old vs new) page variants and running something like a simplified Boolean string-similarity test. I don't mean what humans think is similar, and I don't mean what you think is similar; I mean what a mechanical mind would think was similar or dissimilar (often very different).

    If Google didn't run such checks, you could easily buy up authoritative expired domains, redirect them to yourself and gain loads of SEO authority for nothing. So Google wants to be sure: is THIS content which is receiving this 301 redirect the SAME content which earned those backlinks? Might the webmaster who linked to that old URL decide not to link to this new page? If there's much risk of that, even the mighty 301 redirect gets nuked in terms of equity transfer. Your hope, of course, is that your new URL will be so much better than the old one(s) that over time it will earn more links than they did. If you are lucky, some authority from the old page(s) will filter through, but you should certainly expect some degree of short-term tail-off. If you have done this just as you have escaped a penalty, I can see how the convergence of your technical disruption(s) and the late penalty could be causing you significant issues.

    Instead of doing the kinds of work which remove URLs from your site, remove pages which could be indexing and narrow your content, I'd be doing EXACTLY the opposite: creating new pages and content connected with new (yet relevant) keywords. Maybe work on the top or middle of your keyword (buying) funnel a bit. Get some digital (editorial) PR going, get some more authority and new pages which could be ranking in Google's SERPs. If you think about it, performing purely reductive work after you have had a massive traffic reduction really isn't going to serve you very well. Hope that helps
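To illustrate the trailing-slash point above: whichever structure you redirect away from puts its backlinks behind a 301, so the direction of the rule matters. A sketch of each direction in Apache .htaccess (hypothetical rules; the conditions exclude real files and directories, and you would use one option, not both):

```apache
RewriteEngine On

# Option A: force the trailing slash
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*[^/])$ /$1/ [L,R=301]

# Option B: strip the trailing slash
# RewriteCond %{REQUEST_FILENAME} !-d
# RewriteRule ^(.*)/$ /$1 [L,R=301]
```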

    | effectdigital
    0

  • You can make product pages more robust by including an FAQ section or writing a more detailed description. If you are reselling a product, it can also help to include the product number or manufacturer's ID on the page as well.

    | pilesofpillows
    1

  • Not that I know of unfortunately. If they were local business listings you could use Moz Local, Yext or BrightLocal, but it doesn't look like that is the case. I would just let the client know it may take some time, but with continued link building efforts Google will eventually start to drop Vevox from the search results for MeeToo.

    | pilesofpillows
    3

  • I believe there is a way to do a "Find & Replace" within WordPress or your .htaccess file. I would highly recommend working with an experienced website developer for this. If you have any additional Schema in the <head> of your website, updating it within the <head> template should update it sitewide automatically.
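One concrete way to do a find-and-replace in WordPress is WP-CLI's search-replace command, which runs against the database (a sketch; the domain strings are hypothetical, and --dry-run previews the changes before committing anything):

```shell
# Preview which database tables and rows would change
wp search-replace 'http://olddomain.com' 'https://newdomain.com' --dry-run

# Run it for real once the preview looks right
wp search-replace 'http://olddomain.com' 'https://newdomain.com'
```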

    | pilesofpillows
    2

  • Hello Matt, As mentioned, it will take time for Google to crawl the old site's URLs, de-index them and transfer the rankings to the new site. That process can take a few months to complete, so all you can do is wait. Manually requesting crawling through Search Console can speed up the process a little, but in the end it will still take time. As to why the new articles aren't ranking, that would need to be looked at individually for your site, as there are many factors that contribute to a post ranking higher or lower in the SERPs. Feel free to contact me if you need an analysis and possible solutions for the issue. Daniel Rika - Dalerio Consulting https://dalerioconsulting.com/ info@dalerioconsulting.com

    | Dalerio-Consulting
    1

  • Hi Martijn, The solution didn't work. I'm not sure if there is a conflict here, but this is what my .htaccess currently looks like:

        RewriteEngine On
        RewriteCond %{HTTPS} off
        RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

        #RewriteCond %{HTTPS} !=on
        #RewriteRule ^(.*)$ https://thespacecollective.com/$1 [L,R=301]
        #RewriteRule ^(.*)$ https://www.thespacecollective.com/$1 [L,R=301]

        <IfModule pagespeed_module>
          ModPagespeed off
        </IfModule>

        RewriteRule .* - [E=noabort:1]
        RewriteRule .* - [E=noconntimeout:1]

        <IfModule mod_rewrite.c>
          RewriteEngine On
          RewriteCond %{HTTP_HOST} ^interstellarstore.com$ [OR]
          RewriteCond %{HTTP_HOST} ^www.interstellarstore.com$
          RewriteRule (.*)$ https://www.thespacecollective.com/$1 [R=301,L]
        </IfModule>

    | moon-boots
    0

  • Even if it's a static site, you should still be able to utilise a .htaccess file, Nginx redirects or web.config (depending on your server configuration) to redirect certain URLs to other addresses. Just because a site is static, that doesn't mean that .htaccess (Apache) or web.config (IIS) will not intercept and rewrite: https://www.inmotionhosting.com/support/website/redirects/setting-up-a-301-permanent-redirect-via-htaccess Just stick a '.htaccess' file (extension only, no filename) in your site's root and edit it to suit. Most Apache sites will adhere to a .htaccess file placed in the root. Be sure to use the 301 (permanent) redirect status.
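A sketch of what that root .htaccess could contain for a static site (the paths are hypothetical):

```apache
# 301 (permanent) redirects for individual static pages
Redirect 301 /old-page.html /new-page.html
Redirect 301 /old-section/ /new-section/
```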

    | effectdigital
    0