Category: Technical SEO Issues
Discuss site health, structure, and other technical SEO issues.
-
Fixing a website redirect situation that resulted in a drop in traffic
Hey Carolina, Jarno has nailed it. Just as a side note, if you want to see the traffic for http://domain.com, your web developer should be able to supply you with the server logs, which will give you stats similar to regular analytics. This works because log files record every request made to the server, whereas Google Analytics records requests only for pages with the Analytics code installed. Hope you can sort it. Iain - Reload Media
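To illustrate the difference, here is a minimal sketch of pulling per-page request counts out of Apache-style combined log lines with Python. The sample lines, paths, and regex are assumptions for illustration, not from the original thread; real logs would be read from a file:

```python
import re
from collections import Counter

# Hypothetical sample lines in Apache combined log format
log_lines = [
    '1.2.3.4 - - [10/Oct/2013:13:55:36 +0000] "GET /pricing HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
    '1.2.3.5 - - [10/Oct/2013:13:56:01 +0000] "GET /pricing HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
    '1.2.3.6 - - [10/Oct/2013:13:57:12 +0000] "GET /about HTTP/1.1" 200 256 "-" "Mozilla/5.0"',
]

# Pull the requested path out of the quoted request line
request_re = re.compile(r'"(?:GET|POST) (\S+) HTTP')

hits = Counter()
for line in log_lines:
    m = request_re.search(line)
    if m:
        hits[m.group(1)] += 1

print(hits.most_common())  # → [('/pricing', 2), ('/about', 1)]
```

Unlike Analytics, this counts every request the server saw, including pages without the tracking code.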
| IainReloadMedia
Content is king, is it okay if it's in a widget?
Hey Greenhornet, This depends on how the widget is made. If it's not accessible to search engines, i.e. it's made in Flash, then search engines can only read limited information from it, so in the Flash case it's not of any real benefit. If the widget is accessible to search engines, then it's better for the home page. However, if it's on every page, you'll have issues with duplicate content. Therefore, just have the crawlable widget on the home page, and start developing content for all your other pages. Hope that helps, and good luck. Iain - Reload Media
| IainReloadMedia
Rankings of Subdomains vs. Main Domain
Maybe fixing the robots.txt problem solves everything for you. I hope it does. I always use the same kind of robots.txt: as short as possible and as clear as possible. Archives and everything else are allowed, and so far (knocking on a piece of natural wood) nothing like this has ever occurred. Again, hope this helps your site. Kind regards, Jarno
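For reference, a short allow-everything robots.txt of the kind described is just this (the sitemap URL is a placeholder, not from the original answer):

```text
User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap.xml
```

An empty Disallow line means nothing is blocked, so archives and everything else stay crawlable.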
| JarnoNijzing
Making one landing page rank higher than another
Brad has hit the nail on the head. Very clear, concise advice on what to do!
| MiriamEllis
Is there a suggested limit to the amount of links on a sitemap?
Your HTML sitemap is best for website visitors, so best practice is to list the most important sections/pages. Google can use your HTML sitemap page to crawl the rest of your site as long as the structure can be followed. If you have lots of pages, it's best to use an XML sitemap and submit it through Google Webmaster Tools. Once your XML sitemap is in the root directory of your website, you can also let search engines know its location through your robots.txt file, like this:

User-agent: *
Sitemap: http://www.SomeDomain.com/sitemap.xml

If your site changes over time, it's a good idea to create fresh sitemaps - just set reminders for yourself in a calendar.
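For completeness, a minimal XML sitemap in the sitemaps.org format looks like the sketch below. The URLs and dates are placeholders, not from the original answer:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.SomeDomain.com/</loc>
    <lastmod>2013-05-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>http://www.SomeDomain.com/products/</loc>
    <lastmod>2013-04-15</lastmod>
  </url>
</urlset>
```

Only loc is required per URL; lastmod and changefreq are optional hints to the crawler.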
| Prospector-Plastics
Google local listings
Hello Vladislav, Interesting question. Now, 'gutter ma' isn't exactly what I would call a local search. Local searches are typically city-based. So, for you, if you're located in Springfield, true local searches would be for things like 'gutter cleaning springfield', or searches without the 'springfield' but performed on Springfield-based devices. So, when we look at the more random results for a statewide search like 'gutter ma' (which is probably not how most people would search for your services), we see a variety of locales represented. Setting my location to Springfield, this is what I see:

- Abbott Window & Gutter Cleaning, Inc. (www.abbottwindowandguttercleaning.com) - 1 Google review - 2 Kane Industrial Drive, Hudson, MA - (978) 562-1744
- Minuteman Leafguard (www.leafguardgutters.com/boston-ma/hudson) - Google+ page - 50 Boston Road, Lowell, MA - (978) 256-0092
- Insul-Kings (www.insulkings.com) - 1 Google review - 66 Farm Road, Marlborough, MA - (508) 481-0666
- Marlboro Gutter Cleaning (www.cleangutter.com) - Google+ page - Plymouth, MA - (800) 994-3000
- Gutter Pro (www.gutterpro.com) - 2 Google reviews - 60 Medford Street, Somerville, MA - (617) 868-2673
- Carroll Seamless Gutters Inc (www.carrollsons.com) - Google+ page - 16 Jacques Street, Worcester, MA - (508) 488-9999
- Custom Insulation Company, Inc. (www.custominsulation.com) - 2 Google reviews - Google+ page

As you can see from this search, Google is selecting businesses from many different cities within the state. What their algorithm is for doing so on a broad search like this, I do not know. But, at least on my search, it doesn't appear that they are simply putting up listings from the state capital (Boston). The selection looks quite random to me, and I have no idea how one could influence it because of that apparent randomness.
Also, I would truly question the value of ranking in a local pack like this for a state-based, unspecific search. Presumably, people are looking for your client for 'gutter cleaning', not just 'gutter', and if they leave off any geo terms at all, Google will show them local results based on their location. If they add a city term, it would presumably be their own city, in which case your client has the best chance of ranking in the local pack of results for their city of location. Hope this helps!
| MiriamEllis
Proper method of consolidating https to http?
Hi there, I would agree with your developer in using 301 redirects to ensure all static pages resolve only to the HTTP version while the secure pages resolve only to HTTPS. As for SEO, the search engines should follow these 301 redirects just fine, but it might also be a good idea to designate canonical URLs telling the search engines to index only the non-HTTPS pages, just to be safe. The PHP code below detects which version of the page is being accessed and, on the HTTPS version, inserts a canonical tag pointing to the non-HTTPS URL:

<?php
// Build the current URL from the host and the request path
$currenturl = $_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI'];
// Check if the request came in on the secure HTTPS port, which is 443
if ($_SERVER['SERVER_PORT'] == "443") {
    // Connected to the secure port, so formulate the HTTP canonical version
    $canonicalversion = "http://" . $currenturl;
    // Echo the canonical version into the HTML as a link rel="canonical" tag
    echo '<link rel="canonical" href="' . $canonicalversion . '" />';
}
?>
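If the 301 redirects themselves are being done at the server level on Apache, a hedged sketch of the kind of .htaccess rules involved looks like this. The secure paths /checkout/ and /account/ are placeholders for whatever pages on the actual site need HTTPS:

```text
RewriteEngine On

# Force HTTP on everything except the secure sections (placeholder paths)
RewriteCond %{HTTPS} on
RewriteCond %{REQUEST_URI} !^/(checkout|account)/
RewriteRule ^(.*)$ http://%{HTTP_HOST}/$1 [R=301,L]

# Force HTTPS on the secure sections
RewriteCond %{HTTPS} off
RewriteCond %{REQUEST_URI} ^/(checkout|account)/
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```

With rules like these in place, each URL resolves to exactly one protocol, and the canonical tag becomes a belt-and-braces safeguard rather than the primary fix.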
| StreamlineMetrics
Site offline - Mitigating measures?
Oh that's a bummer! Unfortunately since you don't own the domain right now there's not a ton to be done. Fortunately, 48 hours isn't SO bad. Google will re-crawl you when you re-launch (keep a close eye to make sure they do so correctly and all your pages get indexed), and you'll probably see a bit of a dip in traffic/rankings for a bit, but you should recover within a month or two. I'd spend this time thinking up some ways to quickly earn some new links - maybe a new piece of content, or building some new relationships? That will mitigate the effects from this. Good luck!
| RuthBurrReedy
Does google like Category pages or pages with lots of Products on them?
We are currently having meetings about the bigger issues, but I need examples like this (individual pages) to explain the reasoning why we need to make changes. This is a very big site, and it will take some time to get the needed changes done. Also, they asked me directly about these pages. I just enjoy getting other people's feedback, so thank you.
| DoRM
Multilingual blogs and site structure
Thanks for the clarification. I can't really comment intelligently without knowing more about the nature of the site and the intended audience. But may I gently suggest you might want to think about the issue more broadly than as a technical SEO issue? For instance, why do you assume that English readers wouldn't be interested in Japanese holidays when you also say many readers are bilingual? If you made an editorial decision to publish everything in both languages, the SEO piece would look very different. I always advocate settling the editorial issues first, then plugging in the SEO component.
| DanielFreedman
What's the latest on Title Tags?
CTR is a ranking factor in AdWords, but I haven't heard of it being a factor in SEO. I do see that they track it to some extent in Webmaster Tools.
| netviper
How do I Address Low Quality/Duplicate Content Issue for a Job portal?
Unique listing copy: I would try to get that unique content to the top of the source order - it doesn't necessarily have to appear at the top of the page - it could be in a sidebar, for instance, but it should come first in the source so that Googlebot gobbles it up before it reaches duplicate stuff or secondary nav / footer links etc.

No-results pages: Yes, you could certainly noindex your no-results pages with a robots meta tag - that would be a good idea.

Loading duplicate content with ajax: In terms of Google and ajax content, yes, Googlebot can and does follow links it finds in JavaScript. All I can tell you here is my own experience. On my product detail template, I have loaded up category descriptions with ajax that appear canonically (if that's the right way of putting it) on my listing pages. In the SERPs, the category description content is indexed for the pages I want it indexed for (the listings in this case), and not for the product detail pages where I'm loading it with ajax. And those product detail pages still perform well and get good organic landing traffic. On the product detail page, the copy sits in an accordion and is loaded with an ajax request on document ready. It might be considered slightly more kosher to do this in response to a user action, though - such as clicking on the accordion in my case. The theory being that you're making your site responsive to a user's needs, rather than loading up half the content one way and the other half another way, if you get what I mean. Sometimes, of course, you just cannot avoid certain amounts of content duplication within your site.
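For the no-results pages, the robots meta tag being discussed is a single line in each page's head section; the follow directive is optional but lets Googlebot keep crawling any links on the page:

```html
<meta name="robots" content="noindex, follow">
```

This keeps thin no-results pages out of the index without blocking the crawler the way a robots.txt disallow would.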
| LukeHardiman
Is pagination a bad thing for SEO
Thank you for this. I've read it a few times, but I am still puzzled about how to do it. I use Joomla and am not sure how I can implement this; any more advice would be great.
| ClaireH-184886