Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • It might be that your updated title tag language is decreasing click-throughs to your site - so you're still ranking the same, but not as many people are clicking. Since you want to target a more qualified audience, this dip in traffic might not be a bad thing if you're seeing a higher conversion rate or more qualified leads. If you're running AdWords, it might be worth testing both versions of your title tag language in ads to see whether that is impacting your click-through rate.

    | RuthBurrReedy
    0

  • Thanks for sharing your experience! I'll look for the blog post on this topic for sure.

    | kirupa
    0

  • Keep in mind that for a site:domain.com search, Google now includes pages from OTHER SITES that are using the canonical tag to point to your site. So, even though it says there are 300 pages indexed, 30 of those pages might be on other sites that use the canonical tag pointing to your site. The number of pages indexed that you're looking at may not be entirely accurate because of this.
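
    For illustration, a minimal sketch of how to check where any given page's canonical tag points (the URL is a placeholder; the requests and beautifulsoup4 packages are assumed):

        import requests
        from bs4 import BeautifulSoup

        def canonical_target(url):
            """Return the href of the page's rel=canonical link, or None."""
            html = requests.get(url, timeout=10).text
            link = BeautifulSoup(html, "html.parser").find("link", rel="canonical")
            return link.get("href") if link else None

        # If this prints your domain, that page is passing canonical signals to your site.
        print(canonical_target("https://some-other-site.com/their-page"))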

    | GlobeRunner
    2

  • I'm assuming you include these tags for Facebook's benefit, because if you want to send localization signals to Google, it's better to use hreflang. Hreflang allows a comprehensive range of countries and languages: you can combine any two values from the ISO-639-1 list of language codes and the ISO-3166-1 Alpha 2 list of countries. The locales supported by Facebook are a lot more limited. You can't have en-IT for English speakers in Italy, for example, because Facebook doesn't support that value. Facebook only allows the following English-language locales:
    - UK English: en_GB
    - US English: en_US
    - English (India): en_IN
    - English (Pirate): en_PI
    - English (Upside Down): en_UD
    You can see Facebook's full list of supported locales on this page. So the answer is that it won't matter where your users are. What will matter to Facebook is the style of English you use. If you write in British English, you should use en_GB, and if you use US spellings, then use en_US.
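
    As a minimal sketch (the locale pairs and URLs are hypothetical), this is the kind of markup those language-plus-country combinations produce in hreflang:

        # Any ISO-639-1 language code can pair with any ISO-3166-1 alpha-2 country
        # code in hreflang, including combinations Facebook doesn't support.
        pages = {
            "en-gb": "https://example.com/uk/",
            "en-us": "https://example.com/us/",
            "en-it": "https://example.com/it/",  # fine for hreflang, no Facebook equivalent
        }

        for locale, url in pages.items():
            print(f'<link rel="alternate" hreflang="{locale}" href="{url}" />')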

    | StephanSolomonidis
    0

  • Hi there. Your link doesn't work. Anyway. YouTube videos are embedded as iframes, and as we know, iframes are treated pretty much as part of another website, inserted into your website's page. I assume that price widget is the same. So, whether you lazy load it or not, the iframe will not be considered part of YOUR content, and certainly not unique content. So, make sure that you have plenty of other content on those pages - text, images, whatever. Otherwise, even from a user experience perspective, it would not be the best page to be on. Hope this helps.

    | DmitriiK
    0

  • Maureen, I wanted to be sure, so I did a bit of research, and there is a very limited set of markups that work with the highlighter. Here is the list, which matches what you see in the dropdown on the highlighter. Given how long the highlighter has been available, my guess would be that we won't see it work with anything else. I am not sure why, but it has stayed like this for a long time. Best

    | RobertFisher
    1

  • We moved a lot of client sites over to HTTPS from HTTP. We always use proper 301 Permanent Redirects, and there has been no loss of rankings as far as we can tell. When it comes to links in particular, if you 301 redirect one page to another on the same domain name, you will NOT lose any link value. The key here is that it's a redirect on the same domain name. If you are moving from HTTP to HTTPS, make sure you verify the site in Google Search Console and update Google Analytics, as you will see a loss of data if you don't update those.
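
    As a rough sketch (the URLs are placeholders and the requests package is assumed), you can spot-check that the old HTTP URLs 301 to their HTTPS equivalents like this:

        import requests

        old_urls = ["http://example.com/", "http://example.com/about/"]

        for url in old_urls:
            r = requests.get(url, allow_redirects=False, timeout=10)
            target = r.headers.get("Location", "")
            # A proper migration returns a 301 pointing at the https:// version.
            ok = r.status_code == 301 and target.startswith("https://")
            print(f"{'OK' if ok else 'CHECK'}: {url} -> {r.status_code} {target}")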

    | GlobeRunner
    0

  • Based on what I understand, this is a quite normal side effect of going to HTTPS, and in many cases the benefits you may have gained from performing multiple page speed upgrades, responsive edits, etc. may be reduced in their effectiveness from a page speed perspective. However, and this is a big however, you will also benefit from an SEO stance, as Google now considers use of an encrypted protocol a ranking factor, albeit a small one at present. Second to this, before too long many of the browsers will have fully adopted HTTP/2, which is based on SPDY (developed by Google) and apparently brings massive speed gains. In order to adopt HTTP/2 you have to have already made the leap to using SSL. HTTP/2 allows requests to load in parallel, which will in turn give you much greater speeds going forward, since requests load at the same time and hence speed things up even further. Based on your current scores, I would imagine there are also a few more changes you could make to further improve your site's speed while still using SSL: CDNs, caching, compression and more. Use sites like tools.pingdom, gtmetrix and webpagetest.org to see what else can be improved. Hope this offers some further insight.
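
    If you want to see whether a server already negotiates HTTP/2, here is a minimal sketch using the third-party httpx client, installed with HTTP/2 support via pip install "httpx[http2]" (the URL is a placeholder):

        import httpx

        with httpx.Client(http2=True) as client:
            response = client.get("https://example.com/")
            # Prints "HTTP/2" if the server negotiated it over TLS, else "HTTP/1.1".
            print(response.http_version)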

    | TimHolmes
    0

  • Hello! Small sites generally won't stay in our index if we don't find NEW links to a domain. Our cache will remove a link after 190 days unless it is re-crawled. https://moz.com/researchtools/ose/just-discovered?site=moz.com&filter=&source=&target=page&page=1&sort=crawled I recommend checking Just Discovered to see if your domain is in the queue for re-indexing from newly discovered links. If not, you will want to keep link building on sites that have high authority and are likely to be picked up. There are some insights provided here that will help: https://moz.com/community/q/da-pa-fluctuations-how-to-interpret-apply-understand-these-ml-based-scores Cheers!

    | DavidLee
    0

  • When it comes to hosting images off site, keep in mind that there may be connectivity and image load speed issues associated with these images. I typically wouldn't rely on an image being pulled from someone else's site, even if it was something like Flickr or Photobucket. I don't think those particular services intend for you to use their service like that. Rather, we typically will set up a CDN for hosting the images or PDFs, and use a subdomain of the main site in order to host the images. For example, we would set up images.domain.com and pdf.domain.com and put the images and PDFs on the CDN.
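
    To illustrate the pattern (the domains and the helper below are hypothetical), the rewrite just moves asset paths onto the CDN-backed subdomains:

        def to_cdn(url):
            """Move /images/ and /pdf/ asset paths onto their CDN subdomains."""
            url = url.replace("https://domain.com/images/", "https://images.domain.com/")
            url = url.replace("https://domain.com/pdf/", "https://pdf.domain.com/")
            return url

        print(to_cdn("https://domain.com/images/logo.png"))   # https://images.domain.com/logo.png
        print(to_cdn("https://domain.com/pdf/brochure.pdf"))  # https://pdf.domain.com/brochure.pdf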

    | GlobeRunner
    0

  • Duplicate title tags are generally not something that you want, but in some cases they are necessary. If the content on the page is different, then it is not going to be a disaster for the site. I would make sure that Yoast is enabled (or use some other way of taking care of your site's meta data, etc.). If you do have a duplicate title tag issue, though, keep in mind that there may be other issues associated with it. For example, you may need to use the canonical tag to take care of pages that are duplicates (hence the duplicate title tags) or use the robots.txt file to disallow crawling of certain sections of your site.
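
    If you want to audit titles yourself, here is a minimal sketch (the URLs are placeholders; the requests and beautifulsoup4 packages are assumed) that flags duplicate title tags across a set of pages:

        from collections import defaultdict

        import requests
        from bs4 import BeautifulSoup

        urls = ["https://example.com/a", "https://example.com/b"]
        titles = defaultdict(list)

        for url in urls:
            soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
            title = soup.title.get_text(strip=True) if soup.title else ""
            titles[title].append(url)

        for title, pages in titles.items():
            if len(pages) > 1:
                print(f"Duplicate title {title!r} on: {', '.join(pages)}")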

    | GlobeRunner
    0

  • As per Dirk and Logan, I personally would try not to have two sets of content to deliver; the point of responsive design is really to have one page that is simply delivered in a visually consistent way on all device types. As for it being considered duplicate content, this is a little harder to determine, and I would imagine Google won't penalise you for it unless it is really spammy and used for keyword stuffing/cloaking, etc. Here is an old video from M. Cutts, although it may not be totally relevant in today's SEO landscape and is probably more geared towards duplicate pages.

    | TimHolmes
    0

  • Whenever you move, here's what we typically recommend:
    1. Crawl the site and make note of the URLs.
    2. Set up redirects from the old URLs to the new URLs.
    3. Test to make sure they're proper 301 redirects (see the sketch below).
    4. Verify all versions of the site in Google Search Console (old site and new site, http and http://www versions as well as https versions).
    5. Use the Google Change of Address Tool.
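
    A minimal sketch for step 3, assuming a hypothetical redirects.csv with old_url,new_url rows (no header) and the requests package:

        import csv

        import requests

        # Each row maps an old URL to the new URL it should 301 to.
        with open("redirects.csv", newline="") as f:
            for old_url, new_url in csv.reader(f):
                r = requests.get(old_url, allow_redirects=False, timeout=10)
                location = r.headers.get("Location", "")
                ok = r.status_code == 301 and location == new_url
                print(f"{'OK' if ok else 'CHECK'}: {old_url} -> {r.status_code} {location}")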

    | GlobeRunner
    0

  • Hello Robert, hope you're doing well. It seems I just didn't include all the info I needed; making posts as if you all have access to the data in my head is a mistake. What I meant is that I have 500 URLs. After doing a full link audit, 153 of those URLs turned out to be trashy directory or comment spam backlinks with very spammy site templates, and they often didn't have anchor text with our main keywords. Now yes, 153 out of 500 URLs isn't that bad; however, those 153 URLs are more than 3 years old, and our total backlinks have grown from 160 (a year ago) to 500 (today), meaning at one point in time the majority of our backlinks were from trashy directories and comment spam. In fact, as recently as last year, 160 minus 153 left us with only 7 possibly OK backlinks. So there's a very high possibility these URLs were the reason we got hit by Penguin (I have data that suggests this as well). It's easy to get fixated on one SEO key point or another, but when you do, just remember to follow the data trail and look for the other SEO footprints as well during that search; you can find a good bit. Even answering questions from others here helps me find further key points to highlight and work through.

    | Deacyde
    0

  • Great follow up!  Thanks for that. :^)

    | RyanPurkey
    0

  • Hi Aleyda, Thanks so much for your in-depth answer! You have confirmed what I suspected is the case. I've been working with dev to try and get this issue fixed, and hopefully it will be soon! Thank you again. Daniel

    | DanielKiely6
    0

  • When you launched the new version of the site, did you set up the 301 Permanent Redirects so that all of the old pages are mapped/redirected to the most appropriate page on the new version of the site? It sounds to me like you might not have set up the 301 redirects properly. One thing you can do, though, is look at your Google Search Console crawl errors and pay attention to the most important ones, and fix those first.

    | GlobeRunner
    1

  • I agree with the responses here: having two sites is generally not the way forward. I just want to add one other option. Most people who received link-based penalties were building links to 2 or 3 pages. If that's the case, you could 301 redirect most of the site on a page level and let the pages with bad links 404. If you have cleared the penalty, it might not even be an issue to 301 redirect the whole domain. Just make sure that you update your disavow on the new domain.

    | Carson-Ward
    1

  • I've set up a new redesign of our page to reduce the number of links per item from 4 to 1. We are also going to reduce the number of items from 30. The challenge here is that in some cases there really are more than 30 items that are relevant (i.e. not just related, but actually part of the list, as they should be). We've got some thinking to do as to whether we do pagination or incremental load.

    | APFM
    0