Category: Technical SEO Issues
Discuss site health, structure, and other technical SEO issues.
-
Redirect and ranking issue
I see this as good news. Sites break and pages go missing (according to Google's crawl) from time to time. I was concerned at first that permanent redirects had been added by mistake. I would give it some time and default to Google's usual advice for fixing 404s, which is to simply fix the error and monitor in Search Console.
| Eric.W.Caudle0 -
My site ranking has dropped in the last 2 weeks?
Although your site appears to load fine, Pingdom reported a load time of 6.41 seconds. You have a number of redirect chains, scripts that could be combined, large request sizes, and so on. Most of these things are easy to fix. There are many services out there where you can get a decent report, and the report will give you a good list of fixes to make. I use https://tools.pingdom.com. I've seen page load time (especially after recent site changes) adversely affect site ranking more than any other single cause.
| Eric.W.Caudle0 -
Removing Multiple 301 Redirects
Hi Robert! The safest route here seems to be simply adding an extra 301 to the https version. This will consolidate all the links out there. If you remove one of the middle URLs and it happens to have links pointing to it, you'll lose a potentially good link. Google handles up to 4-5 hops well, so I think you'll be OK with just these 3 hops. To back up what I'm saying, Matt Cutts covered it in these videos: "Can too many redirects from a single URL have a negative effect on crawling?" and "Is there a limit to how many 301 (Permanent) redirects I can do on a site?" Hope it helps. Best of luck. GR.
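As a rough illustration of why hop counting matters, here's a small sketch that walks a redirect map and counts hops before the final destination. The URLs, the map, and the 5-hop limit are all made up for the example; real chains would be checked against live HTTP responses.

```python
def count_hops(start, redirects, max_hops=5):
    """Follow a URL through a {source: target} redirect map.
    Returns (final_url, hops); raises if the chain loops or gets too long."""
    url, hops = start, 0
    seen = {start}
    while url in redirects:
        url = redirects[url]
        hops += 1
        if url in seen:
            raise ValueError("redirect loop detected at " + url)
        if hops > max_hops:
            raise ValueError("chain exceeds %d hops" % max_hops)
        seen.add(url)
    return url, hops

# Hypothetical 3-hop chain: old page -> new page -> https -> https+www
chain = {
    "http://example.com/old-page": "http://example.com/new-page",
    "http://example.com/new-page": "https://example.com/new-page",
    "https://example.com/new-page": "https://www.example.com/new-page",
}
final, hops = count_hops("http://example.com/old-page", chain)
```

Three hops stays comfortably inside the limit; if the count crept higher, that would be the point to collapse the middle URLs directly to the final https destination.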
| GastonRiera0 -
Ranking penalty for "accordion" content -- hidden prior to user interaction
Although I have no active accordions at the moment, I do have Bootstrap tabs, and all of the hidden content within the tabs is crawled. To test your own page, go to Webmaster Tools and use Fetch as Google. When the fetch is done, click on the page link and select the "Fetching" tab. This shows all of the HTML crawled, and you can verify that the content in the accordion has been crawled. You also mentioned "penalty". Google's policy on penalties for hidden content is here: https://support.google.com/webmasters/answer/66353?hl=en. Google would not consider tabs and accordions containing legitimate content (no white text, etc.) to fall under this "Hidden text and links" category.
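For a quick local check alongside Fetch as Google, you can confirm the tab content is present in the served HTML (which is what crawlers read) regardless of CSS visibility. The markup and class names below are hypothetical:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect all text nodes from an HTML document."""
    def __init__(self):
        super().__init__()
        self.chunks = []
    def handle_data(self, data):
        self.chunks.append(data)

# Hypothetical Bootstrap-style tab pane hidden via CSS
html = """
<div class="tab-pane" style="display:none">
  <p>Shipping takes 3-5 business days.</p>
</div>
"""
extractor = TextExtractor()
extractor.feed(html)
page_text = " ".join(c.strip() for c in extractor.chunks if c.strip())
# The pane's copy is in the DOM even though CSS hides it from users
```

The hidden text shows up in the parsed output, which is exactly why tabbed content is crawlable: CSS only hides it visually.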
| GlennFerrell1 -
Facebook Pixel Integration
Hi Slumberjac, The pixel is embedded in a piece of JavaScript which pulls an image into your site from Facebook's servers. Just ignore it; it has no effect on SEO whatsoever. The only time it could have an effect is if it loaded slowly. I would not play about with it in an attempt to nofollow it. Regards, Nigel
| Nigel_Carr0 -
Multiple robots.txt files on server
So what's the best policy if a site uses an e-commerce platform like Magento, which has a robots.txt file, but also has a WordPress blog installed in another folder (e.g. /blog) that uses a plugin like Yoast, which generates a robots.txt for the WordPress installation? Then you have two robots.txt files: is this detrimental, or no big deal?
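For context, crawlers only ever request robots.txt at the domain root, so a file sitting at /blog/robots.txt is never fetched; the root file governs the blog paths too. A small illustration using the standard-library parser (the disallow rules are hypothetical Magento-style directives):

```python
from urllib.robotparser import RobotFileParser

# Only the root file matters; /blog/robots.txt would simply be ignored
root_robots = """\
User-agent: *
Disallow: /checkout/
Disallow: /customer/
"""

rp = RobotFileParser()
rp.parse(root_robots.splitlines())

# Blog URLs are governed by the root file's rules
blog_ok = rp.can_fetch("*", "https://example.com/blog/some-post")
checkout_ok = rp.can_fetch("*", "https://example.com/checkout/cart")
```

Here the blog post is crawlable and the checkout path is blocked, all from the single root file; any rules the WordPress plugin writes into its own folder have no effect unless they are merged into the root robots.txt.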
| peakdistrictseo0 -
Pages Fighting Over Keywords
Hi Neil, Thanks for your reply. So, the thing is, it's typically much, much harder to rank national pages (you're competing against the nation) than to rank local ones - if the searcher is local and the search is perceived to have a local intent, because these pages are only competing locally. So, another question from me: Are the local pages ranking well for local searchers? As in, your office in Atlanta is telling you they see the Atlanta landing page come up instead of the national page? And your office in Dallas is telling you they see the Dallas page come up instead of the national page? Or, are you saying, you have a searcher in San Francisco (where you have no office) seeing the Dallas page instead of the national page?
| MiriamEllis0 -
301 Domain Redirect And Old Domain to a New one including pages
Hey David, I apologize, help me understand. Your problem is that your new website's homepage is not at the root (newdomain.co.uk) – and redirecting all URLs at the top level will not work because of this?
| brooksmanley0 -
Bulk URL Removal in Webmaster Tools
Hi Michael, Is .htaccess an option? List the URLs by hand or with wildcards and return a 410 status code to make sure search engines know these pages are really gone. These links might get you started: http://stackoverflow.com/questions/33247849/using-htaccess-to-410-any-wildcard-url-that-contains-a-question-mark http://www.quickonlinetips.com/archives/2014/11/http-410-error-pages-htaccess/ Hope that helps. Bas
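Before committing wildcard 410 rules to .htaccess, it can help to preview which URLs a pattern would actually catch. A small sketch using shell-style wildcards (the patterns and paths are made up; the real rules would live in the server config as the links above describe):

```python
import fnmatch

# Hypothetical wildcard rules for paths that should return 410 Gone
gone_patterns = ["/old-category/*", "/tmp-*"]

def is_gone(path):
    """Return True if the path matches any 410 wildcard rule."""
    return any(fnmatch.fnmatch(path, pattern) for pattern in gone_patterns)

# Preview the rules against a sample of real URLs before deploying
sample = ["/old-category/widget-123", "/tmp-report", "/blog/new-post"]
hits = [p for p in sample if is_gone(p)]
```

Running a site's full URL list through a check like this catches overly greedy wildcards before they start returning 410 for pages you still want indexed.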
| BasKierkels0 -
Old url is still indexed
Hello, I agree with Agencia SEO. The 301 redirect should help take care of the problem. It takes a little time to kick in, but it will help with all the search engines, not just Google. Best Regards
| Dalessi0 -
301 Re-direct help
Not sure if this is exactly the way it should be handled, but here are some thoughts for consideration: 1. I'd consider pointing domain-A's DNS to domain-B's server, then setting up a general 301 domain redirect for domain-A -> domain-B, and then doing 301 redirects per URL from domain-A to domain-B as applicable/relevant. The redirects would be managed on domain-B, since domain-A would be pointing to domain-B's server. 2. Since you're effectively "changing site address" here, you'd want to ensure you have both sites and all versions set up in GSC (domain-a.com, www.domain-a.com, https://domain-a.com, https://www.domain-a.com, domain-b.com, www.domain-b.com, etc.) and then initiate a change of address for domain-a.com to go to domain-b.com. Cheers
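The per-URL-with-fallback logic in step 1 can be sketched like this: redirect to a mapped URL where one exists, otherwise fall back to the same path on the new domain. The mapping and domain names are hypothetical; the real rules would live in domain-B's server configuration.

```python
# Hypothetical old-path -> new-path mapping for URLs that moved
URL_MAP = {
    "/products/widget": "/shop/widget",
    "/about-us": "/about",
}

def redirect_target(path, new_domain="https://domain-b.com"):
    """Return the 301 target on domain-B for a request to domain-A.
    Mapped paths go to their new location; everything else keeps its path."""
    return new_domain + URL_MAP.get(path, path)

moved = redirect_target("/products/widget")
unmapped = redirect_target("/unmapped-page")
```

The fallback matters: without it, every old URL not in the map would 404 on the new domain instead of at least landing on an equivalent path.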
| regal_kyle0 -
CSS user select and any potential affect on SEO
Nope, no effect, as you suspected, Eddie. That kind of attempt at copy-blocking doesn't change the way the content is available in the DOM of the page (which is why it's so ineffective), and so it has no effect on crawling/indexing. You can prove this for yourself by going to a page and right-clicking to open the browser's Inspect mode. This mode shows the actual rendered DOM of the page that the search engines are reading, and you'll see the content is easily accessible. The other option is to do a Fetch and Render request from within the site's Google Search Console, and it will also show you what content Google can see. Is that what you were looking for? Paul
| ThompsonPaul0 -
US and UK Websites of Same Business with Same Content
Yup, but doesn't matter. Hreflang works for this situation whether cross-domain or on a subdirectory/subdomain basis (and in fact is even more effective when cross-domain as you're also getting the benefit of the geo-located ccTLD.) P.
| ThompsonPaul1 -
One company, 3 countries, 3 sites - best solution?
Gaston, your answer is correct (albeit not complete... see my answer below). Plus: when linking to posts and guides, even if they are by Moz, always explain why they can be useful.
| gfiorelli10 -
Wrong titles for site links of my website.
If I search normally, I can't replicate this, but when I run the following query: site:find.exchange "about about" ...I'm seeing a page with that "ABOUT ABOUT about" title, even though a normal search shows the correct display title (for the same exact URL). Looks like the cached version also has the correct title tag (?) Not sure what SVG Tim is referring to, but it's worth checking out. Google is clearly picking up some odd cue somewhere, but this is definitely an unusual case and I'm not seeing it in the text anywhere on this page or even on a duplicate page on your site.
| Dr-Pete0 -
Desktop & Mobile XML Sitemap Submitted But Only Desktop Sitemap Indexed On Google Search Console
Hi Allison, any updates on this? From my understanding, it's possible that Google is not indexing the mobile versions of pages if they simply correspond to the desktop pages (and are indicated as such with the rel=alternate mobile switchboard tags). If Google has that information, it may simply index the desktop pages and then display the mobile URL in search results. It is also possible that the GSC data is not accurate: if you do a 'site:' search for your mobile pages (I would try something like 'site:domain/m/' and see what shows up), does it show a higher number of mobile pages than what you're seeing in GSC? Can you check data for your mobile rankings and see which URLs are being shown to mobile searchers? If your data shows that mobile users are landing on these pages from search, that would indicate they are being shown in search results, even if they're not showing up as "indexed" in GSC.
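One way to audit the switchboard setup is to parse the desktop sitemap and collect the desktop-to-mobile pairs declared in the rel=alternate annotations, then compare them against the mobile sitemap. The sitemap snippet below is a hypothetical example:

```python
import xml.etree.ElementTree as ET

SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://example.com/page-1</loc>
    <xhtml:link rel="alternate"
                media="only screen and (max-width: 640px)"
                href="https://example.com/m/page-1"/>
  </url>
</urlset>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9",
      "xhtml": "http://www.w3.org/1999/xhtml"}

# Map each desktop URL to the mobile alternate it declares
pairs = {}
for url in ET.fromstring(SITEMAP).findall("sm:url", NS):
    desktop = url.find("sm:loc", NS).text
    alt = url.find("xhtml:link", NS)
    if alt is not None and alt.get("rel") == "alternate":
        pairs[desktop] = alt.get("href")
```

Any mobile URL in the /m/ sitemap that never appears as an alternate here would be a candidate for why GSC treats the two sitemaps so differently.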
| bridget.randolph0 -
How can I make sure a desktop page is shown in the (desktop) search results instead of the mobile page?
Hi Willie, Is your homepage showing up for other keywords? Are you sure it's been indexed? Make sure you haven't accidentally blocked the homepage from crawlers. Even if it is in the index already, now that you've made these updates to the homepage you may want to resubmit it to the index in Google Search Console using the Fetch and Render tool. Also remember that these things can take a little time to update.
| bridget.randolph0 -
Rankings are different in different geographical areas within the same country
Hello, Rankings can also differ for two people in the same region/city/etc., since Google SERPs (rankings) are always fluctuating, especially for lower positions. And yes, the geographical factor is even more important since some results are geo-oriented and rank better in a specific region/city, etc. You are not missing anything, my friend. I recommend reading this Moz post: https://moz.com/local-search-ranking-factors Roberto
| AgenciaSEO.eu0