Category: Technical SEO Issues
Discuss site health, structure, and other technical SEO issues.
-
Over 500 thin URLs indexed from dynamically created pages (for lightboxes)
This was an issue that Yoast ran into with a WordPress upgrade in the middle of 2018. It may be worth a little research into how their "purge" plugin worked, as it did exactly this. Using the .htaccess file, you can simply tell Google not to index the resource pages, and they will naturally fall out of the search results over time. Alternatively, you can purge them manually: log into Google Search Console and select the desired website, click "Optimization" in the left-hand navigation, click "Remove URL" in the sub-menu, then click the "Create a new request for removal" button on that page. Once this is done and the pages are set to noindex, problem solved.
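For a rough sketch of the .htaccess approach (assuming Apache with mod_setenvif and mod_headers enabled; the "/resource/" path below is only an example, so match it to whatever URL pattern the lightbox pages actually use):

```apache
# Send a noindex header for the dynamically created lightbox/resource pages.
# "/resource/" is a placeholder pattern: adjust it to your own thin URLs.
<IfModule mod_headers.c>
  SetEnvIf Request_URI "/resource/" THIN_PAGE
  Header set X-Robots-Tag "noindex, follow" env=THIN_PAGE
</IfModule>
```

Google treats an X-Robots-Tag header the same way as a meta robots noindex tag, so the affected URLs should drop out of the index as they get recrawled, without any template changes.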
| Libra_Photographic0 -
Recovering from a Google penalty
The erroneous advice you got was that Google handles reconsideration requests within a few days. It usually takes a few weeks, and in extreme situations it can take months. When it takes months, it's usually because you have repeat-offended over the same issue: if you get a penalty for link spam and then do more link spam, each time you submit for reconsideration they leave it longer and longer.

It's also down to Google's internal resources. The sad fact of the matter is that, although losing Google does heavy damage to your site, losing your site doesn't do heavy damage to Google. If there are other matters pulling Google's focus internally, it can take quite some time to 'be seen', as it were.

It would be strange for Google to mis-apply a penalty. Usually if you have hacked content, either those pages get nerfed or your whole site gets nerfed. Having one page get nerfed which was not part of the attack is extremely unusual.

One thing you can do is create a free account to query Google's Safe Browsing API: https://developers.google.com/safe-browsing/

My next step would be to ascertain the URL of every page on your site that exists now or has existed within the past 12 months (just to be sure). You can get historic URLs out of the Wayback Machine, Google Analytics (by making a table that combines hostname and page / landing page; unfortunately they won't give you the protocol, so hope that hasn't changed for you in the past year!) or Google Search Console. The live URLs, just crawl with Screaming Frog or similar.

Once you have a complete list, get a developer to build a rough script that will query all your URLs against Google's Safe Browsing API. That would tell you whether Google still sees a problem, where it sees the problem (if it does indeed see one in this area), and whether it sees the problem on live or dead pages.

When you have that to hand, you'll be in a much better position to know whether you still have an issue or whether you just haven't been seen by a Google rep yet. When they decline a reconsideration request, they usually do tell you.

I think the Safe Browsing API, whilst free to access, is limited to 10k queries per day. Don't try to get clever and work around that limit if you are already having Google problems (you don't want MORE!).

Another thing: it never hurts to link to an 'open' (link-access-only) Google Doc (their version of Word) within your reconsideration note. In the Google Doc you can explain much more fully, even with screenshots, what the heck is going on!

You need more information right now. Sorry I'm not giving you an instant solution, but I am telling you exactly what I'd do in your position.
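To give an idea of what that rough script could look like, here is a minimal sketch against the Safe Browsing Lookup API (v4) in Python using the requests library. The API key, client name, input file name and batch size are placeholders you'd adapt to your own setup:

```python
import requests

API_KEY = "YOUR_API_KEY"  # placeholder: free key from the Google Cloud console
ENDPOINT = f"https://safebrowsing.googleapis.com/v4/threatMatches:find?key={API_KEY}"

def check_urls(urls):
    """Query a batch of URLs against Google's Safe Browsing Lookup API (v4)."""
    payload = {
        "client": {"clientId": "my-recovery-audit", "clientVersion": "1.0"},
        "threatInfo": {
            "threatTypes": ["MALWARE", "SOCIAL_ENGINEERING", "UNWANTED_SOFTWARE"],
            "platformTypes": ["ANY_PLATFORM"],
            "threatEntryTypes": ["URL"],
            "threatEntries": [{"url": u} for u in urls],
        },
    }
    response = requests.post(ENDPOINT, json=payload, timeout=30)
    response.raise_for_status()
    # An empty body means no matches; otherwise "matches" lists the flagged URLs.
    return response.json().get("matches", [])

if __name__ == "__main__":
    with open("all_urls.txt") as f:  # placeholder file: one URL per line, live and historic
        urls = [line.strip() for line in f if line.strip()]
    # The API accepts batches, so chunk the list rather than sending one call per URL.
    for i in range(0, len(urls), 500):
        for match in check_urls(urls[i:i + 500]):
            print(match["threat"]["url"], match["threatType"])
```

Check the current Safe Browsing documentation for the exact quota and per-request batch limits before running it over a large URL list.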
| effectdigital0 -
Spam pages being redirected to 404s but still indexed
In addition to the above, you can request removal from Google's index in Search Console https://support.google.com/webmasters/answer/1663419?hl=en As noted, the removal is temporary (90 days), but if you've removed the pages and any links to them, then they won't reappear. What I would do is just check that your sitemap is up to date, and there aren't any legacy sitemaps hanging about that might still reference the pages, and also run a crawl of your site to ensure that there aren't any remaining links to these pages hanging about.
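If it helps, here is a quick sketch (Python with the requests library; the sitemap URL is a placeholder) of how you might list every URL in a sitemap and flag entries that no longer return a 200, which catches both the spam pages and any stale sitemap entries:

```python
import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder
NAMESPACE = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(sitemap_url):
    """Return every <loc> listed in a standard XML sitemap."""
    xml = requests.get(sitemap_url, timeout=30).text
    root = ET.fromstring(xml)
    return [loc.text.strip() for loc in root.findall(".//sm:loc", NAMESPACE)]

if __name__ == "__main__":
    for url in sitemap_urls(SITEMAP_URL):
        # HEAD is enough to read the status; anything that isn't 200 is worth cleaning up.
        status = requests.head(url, allow_redirects=False, timeout=30).status_code
        if status != 200:
            print(status, url)
```

The same URL list also gives you a starting point for the crawl mentioned above.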
| Xiano1 -
My site has an issue
Hi saharali15, Welcome to the Q&A forum! What issue are you trying to solve with this site, exactly? The more specific you can be, the more likely you will get a helpful response to your question. Cheers! Christy
| Christy-Correll1 -
WordPress Redirect Loop on domain name
Maybe check your Apache httpd.conf file?
| OlegKorneitchouk0 -
Any idea why ?ref=wookmark is being appended to URL?
Hi Martijn, I checked my Firefox, and I'm not seeing any plugins or add-ons that would cause that issue. It is strange that it is only that one page; none of my other similar pages do that... Did it happen on your computer? Any other ideas? Greg
| GregB1230 -
If some root domains providing backlinks to a website are from the same server, would it cause an issue?
Hello SeobyKP, That's a tough question. It's fairly easy for Google to see the other domains on any given server. It is not uncommon for shared hosting to have sites in all sorts of different industries, and all levels of quality. My thought is that you should probably put your client's site on a different server, especially if you're going to be linking into it from other domains on that server. I would personally not link to your client's sites from the other ones because they all appear to use the exact same template, and do not seem to be of very high quality. See these: https://viewdns.info/reverseip/?host=alliedautotransport.com&t=1 Either way, try to avoid using keyword-rich anchor text. It is perfectly fine to interlink a network of sites. Many eCommerce brands do this. They tend to be linked to in the footer with the domain or brand name only.
| Everett1 -
HREFLANG: language and geography without general language
No matter what directives you (or your devs) insert on your pages and sitemaps, Google may still ignore them if it finds another resource it "thinks" is better targeted to a given user's region and language. With hreflang, you don't have to specify a region if you target numerous countries that all speak the same language. Keep in mind, however, that there are likely localized variants (different dialects, preferred spellings, cultural differences) that may require you to create customized content even if it's the same language. For instance, English speakers in the UK are likelier to write "prioritise" whereas English speakers in North America are likelier to write "prioritize." If you're targeting both of these regions with specialized services, then I'd recommend you tweak your spellings.
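Purely as an illustration (example.com and the paths are made up), the head of each page could carry both the region-specific annotations and a general-language fallback:

```html
<!-- On every variant of the page, list all variants, including the page itself -->
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/uk/service/" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/us/service/" />
<link rel="alternate" hreflang="en" href="https://www.example.com/service/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/service/" />
```

The bare "en" entry is the general-language version that catches English speakers outside the regions you target specifically; if you only run regional variants, you can omit it and let x-default do that job.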
| zeehj0 -
Is there a great tool for URL mapping old to new web site?
If anyone is looking for a similar tool, I think this one will help: https://www.urlmapper.com/
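If you'd rather script the first pass yourself, here is a rough sketch using Python's standard-library difflib to suggest the closest new URL for each old one (the two input file names are placeholders, and the suggestions should always be reviewed by hand before they become redirects):

```python
import difflib

# old_urls.txt and new_urls.txt: one URL or path per line (placeholder file names)
with open("old_urls.txt") as f:
    old_urls = [line.strip() for line in f if line.strip()]
with open("new_urls.txt") as f:
    new_urls = [line.strip() for line in f if line.strip()]

for old in old_urls:
    # Suggest the closest-looking new URL for each old one; cutoff filters weak matches.
    match = difflib.get_close_matches(old, new_urls, n=1, cutoff=0.6)
    print(f"{old}\t{match[0] if match else 'NO MATCH - map manually'}")
```

The output is a tab-separated mapping you can paste into a spreadsheet and tidy up before generating 301 rules from it.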
| gltharindu0 -
Canonical Page Question
Thanks for your advice. That clears up the misconception for me, since I would have thought the content actually stays the same other than the products shown. We're having some minor dramas with some parameters being indexed despite having parameters configured in Search Console, hence I was wondering if the canonical setup had something to do with it. I'm thinking I'll look into a noindex,follow setup instead for the paginated series.
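If you do try the noindex,follow route on the paginated series, it is just a meta robots tag on pages two onwards; a minimal sketch (the URL is only an example, and whether to keep a self-referencing canonical is a judgment call for your setup):

```html
<!-- On /category/page/2/, /category/page/3/, etc. (not on page 1) -->
<meta name="robots" content="noindex, follow" />
<!-- A self-referencing canonical, rather than pointing every page at page 1 -->
<link rel="canonical" href="https://www.example.com/category/page/2/" />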
| oceanstorm0 -
Google Showing wrong image in the SERPs
You need to be more specific as to where this image is showing up in the SERPs (Google Shopping, featured snippet, mobile results, knowledge graph, shopping/news/AMP results, image results in the SERP, schema cards such as recipes).
| OlegKorneitchouk0 -
Duplicate content issues... en-gb V en-us
You will want to set up multilingual hreflang tags across all pages. This essentially tells crawlers "each of these URLs is actually the same 'page'; you should show the correct version based on the location of the visitor", which prevents them from treating it as duplicate content.
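As a sketch of one way to implement that (example.com and the paths are placeholders), the hreflang annotations can also live in the XML sitemap rather than in each page's head, with every URL entry listing itself and its alternate:

```xml
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://www.example.com/en-gb/pricing/</loc>
    <xhtml:link rel="alternate" hreflang="en-gb" href="https://www.example.com/en-gb/pricing/" />
    <xhtml:link rel="alternate" hreflang="en-us" href="https://www.example.com/en-us/pricing/" />
  </url>
  <url>
    <loc>https://www.example.com/en-us/pricing/</loc>
    <xhtml:link rel="alternate" hreflang="en-gb" href="https://www.example.com/en-gb/pricing/" />
    <xhtml:link rel="alternate" hreflang="en-us" href="https://www.example.com/en-us/pricing/" />
  </url>
</urlset>
```

The annotations need to be reciprocal: if the en-gb page points to the en-us page, the en-us page must point back, or crawlers may ignore the pair.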
| OlegKorneitchouk0 -
Which is better to rank: 301 redirect a 40 DA domain, or host it and link to my site?
Hi Cristophare79, hope you are well.

"Which is better to rank: 301 redirect a 40 DA domain to my website, or host the domain and create posts with my website link?"
It's always better to build a brand and create useful, unique content. When you have an old domain with some authority, you must check its backlinks and decide whether they suit the new site. You might need to disavow for a clean start.

"If I do the 301 redirect, will the crawl errors of the old 40 DA domain display on my new website or not?"
Those crawl errors won't appear. However, you'll be giving Google mixed signals, as many pages from the old domain will be 404; you'll be confusing Google and probably losing some backlinks.

"How many links can I get from one PBN website?"
There is no magic number. I'd strongly suggest not using PBNs, as Google is now smarter than ever and will find out your schemes, hence penalizing you.

"Which is better: getting links to the home page or to posts?"
Again, there is no magic answer. There should be a mix, the most natural combination possible. If it were me, posts would be my choice.

As a note to everyone reading this reply: you have to think, on one hand, about what will help your users most in answering their questions, and on the other hand, remember that humans write Google's algorithms and they will find out if we are messing around with them.

Hope it helps. Best wishes, GR
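For reference, if the old domain's backlinks check out and you do go the 301 route, here is a rough sketch of the redirect in the old domain's .htaccess (assuming Apache with mod_rewrite; the domain names are placeholders):

```apache
# In the .htaccess of the old 40 DA domain: send every URL to the new site.
# Where equivalent pages exist, redirecting page-to-page preserves more value
# than pointing everything at the homepage.
<IfModule mod_rewrite.c>
  RewriteEngine On
  RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.com$ [NC]
  RewriteRule ^(.*)$ https://www.new-domain.com/$1 [R=301,L]
</IfModule>
```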
| GastonRiera0 -
Breadcrumbs with JSON-LD
I certainly don't think there's any harm in a self-referencing confirmation; if anything, it gives a bit of a 'you are here' marker. I found a Stack thread which might be handy: https://stackoverflow.com/questions/29816633/breadcrumbs-on-current-page ... it seems pretty relevant. Obviously they're using microdata and not JSON-LD, but the logic should be similar.
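As a purely hypothetical example (example.com, the page names and paths are made up), a BreadcrumbList where the final item is the current page itself would look like this:

```json
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home",
      "item": "https://www.example.com/" },
    { "@type": "ListItem", "position": 2, "name": "Guides",
      "item": "https://www.example.com/guides/" },
    { "@type": "ListItem", "position": 3, "name": "Breadcrumbs with JSON-LD",
      "item": "https://www.example.com/guides/breadcrumbs-json-ld/" }
  ]
}
```

That block would sit in a script element of type "application/ld+json" on the page, and the last ListItem is the self-referencing 'you are here' entry discussed above.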
| effectdigital0 -
Should I noindex pages on my website that are pulled from an API integration
Hi, I don't see any big problems with an API integration like this. A lot of companies use data from other content providers (through an API); in most cases the question is simply how you can enrich that content. That's why I would leave the pages indexed and work on enriching them with as much other (more unique) content as possible. Martijn.
| Martijn_Scheijbeler0