Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • How are your links to these images coded? Sometimes Google has trouble understanding that something is an image. For example, OBJECT tags can be rendered by Google's renderer, but won't be treated as images or show up in image search. The same is true for images in an img srcset, unless that srcset also has a fallback src value. Are these images showing up in image search? What format are the images (JPG, PNG, SVG, etc.)? If the site is brand new, it could just be that Google is testing out different URLs for the SERP (pretty common with new sites) and will naturally filter out the less useful ones over time. I can understand not wanting to just wait around on that, though.
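    As an illustration of the src fallback mentioned above (file names and alt text are placeholders, not from the original site):

    ```html
    <!-- Without the src attribute, Google may not treat this as an
         indexable image; the src fallback gives crawlers a default URL. -->
    <img srcset="photo-480.jpg 480w, photo-1200.jpg 1200w"
         sizes="(max-width: 600px) 480px, 1200px"
         src="photo-1200.jpg"
         alt="Descriptive alt text also helps image search">
    ```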

    | RuthBurrReedy
    0

  • It's going to be difficult for a single-page website to do much in the way of organic traffic. You're correct that optimizing 80 sites for the same keywords and phrases isn't going to be very effective; these sites will likely be seen as duplicates of each other, especially if they share the same design and much of the same content. Even if Google doesn't de-index some of their pages for being duplicates, you will still be creating 80 websites that are all competing with each other for the same phrases, which will make it harder for any of them to rank. What makes these 80 companies different from each other? Why would someone choose one over the other? If they're local businesses, I'd lean heavily into local SEO for this - using MozLocal to make sure each company's local citations are claimed, filled out, correct, and consistent; and making sure that each company's name, address, and phone number are prominent on the site itself.

    | RuthBurrReedy
    0

  • Thanks guys, some helpful tips!

    | Mat_C
    0

  • Oh, sorry. Somehow I didn't get a notification of your reply. For IIS you can add a rewrite rule to your website's web.config. The code will be something like:

        <rule name="Force WWW and SSL" enabled="true" stopProcessing="true">
          <match url="(.*)" />
          <conditions logicalGrouping="MatchAny">
            <add input="{HTTP_HOST}" pattern="^www\." negate="true" />
            <add input="{HTTPS}" pattern="off" />
          </conditions>
          <action type="Redirect" url="https://www.domainname.com/{R:1}" appendQueryString="true" redirectType="Permanent" />
        </rule>

    With logicalGrouping="MatchAny", the redirect fires when either condition matches (missing www or missing HTTPS), so both cases end up at the https://www version.

    | Keszi
    0

  • Yes, as I mentioned, in my case I use Semrush and there is a dedicated space for that specific parameter. The easiest way to get your log files is to log into your cPanel and find the option called Raw Log Files. If you can't find it, you may need to contact your hosting provider and ask them to provide the log files for your site.

    Raw Access Logs let you see the visits to your website without graphs, charts, or other graphics. You can use the Raw Access Logs menu to download a zipped version of the server's access log for your site, which is very useful when you want to quickly see who has visited your site. Raw logs may only contain a few hours' worth of data, because they are discarded after the system processes them. However, if archiving is enabled, the system archives the raw log data before discarding it. So go ahead and make sure archiving is on!

    Once you have your log file ready to go, you need to gather the other data set: the pages that can be crawled by Google, using Screaming Frog. Using the Screaming Frog SEO Spider, you can crawl your website as Googlebot would and export a list of all the URLs that were found. First, ensure that your crawl Mode is set to the default 'Spider'. Then make sure that under Configuration > Spider, 'Check External Links' is unchecked, to avoid unnecessary crawling of external sites. Now type in your website URL and click Start. Once the crawl is complete:

    a. Navigate to the Internal tab.
    b. Filter by HTML.
    c. Click Export.
    d. Save in .csv format.

    Now you should have two sets of URL data, both in .csv format. All you need to do is compare the URL data from the two .csv files and find the URLs that were not crawlable. If you decided to analyze a log file instead, you can use the Screaming Frog Log File Analyser to uncover your orphan pages. (Keep in mind that the Log File Analyser is not the same tool as the SEO Spider.) The tool is very easy to use: from the dashboard you can import the two data sets that you need to analyze.

    If the answer was useful, don't forget to mark it as a good answer. Good luck!
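    The comparison step above can be sketched in a few lines of Python (the file layout and the 'Address' column name are assumptions; Screaming Frog's internal export typically uses an 'Address' column):

    ```python
    import csv

    def load_urls(csv_path, url_column="Address"):
        """Read one column of URLs from a CSV export into a set."""
        with open(csv_path, newline="", encoding="utf-8") as f:
            return {row[url_column].strip()
                    for row in csv.DictReader(f)
                    if row.get(url_column)}

    def find_orphans(log_urls, crawl_urls):
        """URLs that appear in the server logs but were never found by the crawl."""
        return sorted(log_urls - crawl_urls)

    # Inline demo data standing in for the two .csv files:
    crawled = {"https://example.com/", "https://example.com/about"}
    logged = crawled | {"https://example.com/old-page"}
    print(find_orphans(logged, crawled))  # -> ['https://example.com/old-page']
    ```

    In practice you would call load_urls() on each exported .csv and pass the two sets to find_orphans().
    
    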

    | Roman-Delcarmen
    1

  • Hi there, Thanks for the question. I'm afraid that I'm not too familiar with "Hoth" link building so I can't answer that directly. Regarding subdomains though, from a technical perspective, they are different websites and therefore are capable of passing link equity in the same way that any other website could. It really comes down to the quality and relevance of the subdomain. I'd recommend looking at them in the same way you would any other link and assess them for quality. The one thing to maybe flag is when you have lots of subdomains which are on the same root domain. There is a chance that these may not pass as much value but again, it comes down to why they are there and the quality of them. Hope that helps! Paddy

    | Paddy_Moogan
    0

  • Ha! This is a fun one. OK, so you should know that, like many meta tags, "description" is just a directive. Google don't _have_ to use your meta description, and if they think it's poorly written they will use whatever else they believe to be the top-line leading copy on the page (which is usually something from your body / paragraph content).

    So your meta description reads like this: "FICO is an analytics company that is helping businesses make better decisions that drive higher levels of growth, profitability and customer satisfaction." That's 154 characters, which is inside the usually recommended 155-character limit. Obviously Google actually use pixel width, but this _should_ be OK.

    To be honest, and I really hope you don't take this the wrong way, the opening of the description reads like slightly broken English. It's actually not that bad; I have probably written one or two worse ones myself. Also, it's one huge sentence. I'd write it like this instead: "FICO is an analytics company that helps businesses make strategic decisions. Our insights drive higher levels of growth, profitability and satisfaction." (152 characters.) My main issue was with the phrase "is helping businesses make better decisions", which doesn't read like real English (just calling it as I see it, no offence meant!).

    Now let's get onto the **technical** side of it, because this will really interest you! Your privacy notice, to a renderless (non-headless, JS-disabled, CSS-disabled) crawler, looks like the beginning of your content. Load up your web page (https://www.fico.com/en). Download and install the "Web Developer" extension for Google Chrome, and click the little cog-shaped icon that it adds to your navigation bar. Click the "CSS" tab / menu, then click "Disable All Styles". If you followed my steps accurately, you'll see a page that looks like this: https://d.pr/2AOsys.png What looks most prominent there to you? What's under the main, title-like heading?

    Guess what: it's your privacy notice! What you should do is have the privacy notice coded at the **BOTTOM** of your source code, then use CSS to move it up 'visually'. That way Google won't get confused and think it's the opening of your main content... :') Hope this helps. Love little ones like this
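    One way to sketch the "code it at the bottom, move it up with CSS" idea (class names and copy are placeholders, not FICO's actual markup):

    ```html
    <!-- Privacy notice comes LAST in source order, so a CSS-disabled
         crawler sees the main content first. -->
    <body class="page">
      <main class="content">Main page content, first in the source.</main>
      <div class="privacy-notice">We use cookies on this site.</div>
    </body>

    <style>
      .page { display: flex; flex-direction: column; }
      .privacy-notice { order: -1; } /* visually above .content for users */
    </style>
    ```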

    | effectdigital
    0

  • Yes, my company MageAnts' website pages appear in search engines but are not cached by Google. One more thing: rankings are also going down, so let me know if you have any solutions.

    | vinaso773
    0

  • Hello Kingalan1, This issue with spammy links from Globe sites has been reported before, here: The Globe - Spam Link Network. Should action be taken to remove these links? - Moz Q&A forum

    TL;DR: I wouldn't worry too much about those links. If you find them spammy, Google will probably consider them spam too and ignore them when ranking your page. Only if you believe those links are hurting your rankings should you use the disavow tool.

    Long explanation: On one hand, Google (through their spokespeople) has said that the algorithm is pretty good at finding and diminishing the strength of spammy links. So, if you talked to someone from Google right now, they might say: leave them, the algorithm will probably treat them as spam and there will be no harm. On the other hand, if you are absolutely sure that those links are spammy and malicious, then go ahead and disavow them. Remember that it's possible to disavow an entire domain. More info here: Disavow backlinks - Search Console Help

    Hope it helps. Best of luck, GR
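    For reference, the disavow file Google accepts is a plain-text list, one entry per line; a domain: line disavows a whole domain (the domain names below are placeholders):

    ```
    # Spammy link network reported in this thread
    domain:spam-example-1.com
    # A single bad page rather than the whole domain
    https://spam-example-2.com/bad-links.html
    ```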

    | GastonRiera
    1

  • Take a look at this post on Moz - https://moz.com/blog/seo-changes-using-google-tag-manager. Also, take a look at https://productforums.google.com/forum/#!topic/tag-manager/Ilxks3tQKrg. Personally, I wouldn't recommend doing this via GTM, especially for noindex and schema. They are both important for SEO and are better off implemented via the CMS.
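    For context, the CMS-side noindex recommended above is just a standard meta tag in the page head (not specific to any CMS):

    ```html
    <!-- Served in the initial HTML, so crawlers see it without running JS -->
    <meta name="robots" content="noindex">
    ```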

    | NakulGoyal
    0

  • Hello, I have a blog on which I am using Google Translate, which can convert the blog into multiple languages. Is it suggested to use hreflang tags on my blog? Thanks in advance. (The blog I mention is in my signature.) Ashish Sharma Govtjobstyari.com

    | dalapayal
    0

  • First, you need to keep in mind that Google and Bing use different algorithms, which means they use different evaluation methods to try to get the best results. I have been checking your site. Based on my experience, one of the more accurate ways to measure the performance of a site is by checking its Trust Flow (which basically measures how many of the sites in your neighborhood are pointing to your site). So no matter how many links you have, or what your DA is, what matters is how many of them belong to your niche.

    Moz DA: 17
    Ahrefs UR: 25
    Ahrefs DR: 2.1
    Majestic TF: 1

    As you will notice, your domain rating and Trust Flow are too low. So basically you have links pointing to your site, but they are not passing any value. I suggest you start working on building a link profile. Regards

    | Roman-Delcarmen
    0

  • Just sent an email, no problem at all to post the issue publicly once resolved

    | ruislip18
    0

  • Try a crawl request inside Google Search Console; it might get rectified when Google recrawls your website.

    | vinaso96
    0

  • Istvan has given some great advice here - one other thing I would add about the gibberish URLs, though: especially with the format of these, it's possible that the site may have been the victim of a hack. Make sure you get the site scanned for malware or other hacking activity, and going forward have a security expert ensure that the site is secure (especially if it's using a platform like WordPress, which is a frequent target for hackers).

    | bridget.randolph
    0