Category: Technical SEO Issues
Discuss site health, structure, and other technical SEO issues.
-
Does Bitly hurt your SEO?
Bots can easily identify a URL shortener, since it performs a standard 301 redirect.
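What a shortener does server-side is no different from an ordinary permanent redirect rule; a hypothetical sketch in Apache config (the short code and target URL are made up):

```apache
# A shortener maps each short code to its destination with a permanent redirect;
# crawlers follow it and attribute the link to the final URL.
Redirect 301 /abc123 https://example.com/long-target-page
```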
| zeepartner1 -
Ranking for multiple locations for the same service
Thanks Patrick. I think Google is making it more and more difficult to rank outside of your physical location. Google is smart and recognises that we don't have physical premises in the targeted locations outside of our hometown, so it displays results of local companies in both Google local places and organic results, as it sees the more local companies as a more relevant result and disregards our target location based keywords.
| Jseddon920 -
Would merging a site with strong DA with one that has weak DA be a smart move?
Since DA and PA are logarithmic, merging two sites with a small difference in DA or PA will yield a nice benefit; if there is a big difference, the merger probably won't make the stronger site more competitive. An important thing to look at is the number of links the two sites share. If they have very similar linking domains and linking pages, not much will be gained by the merger. If their linking domains and linking pages are diverse and very different, that is when the most will be gained, based on link metrics alone. Another important consideration is traffic-producing assets: does the site being redirected have unique, substantive, well-written and well-ranking articles? If it has lots of them and they are top quality, they will be a good asset gain for the receiving site. Finally, will the site being 301ed contribute new products, new keyword reach, or improved rankings? These are what might improve on the weaker site's content once it is placed on the stronger site. Also, will the merger give current shoppers a greater selection of products? Greater selection usually means larger average shopping carts.
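The linking-domain overlap described above can be checked mechanically once you export the referring domains for each site from your link tool of choice; a minimal sketch, assuming you already have the two lists (domain names here are made up):

```python
def link_overlap(domains_a, domains_b):
    """Return the shared referring domains and a 0-1 overlap ratio
    (shared domains divided by the smaller profile's size)."""
    a, b = set(domains_a), set(domains_b)
    shared = a & b
    ratio = len(shared) / min(len(a), len(b)) if a and b else 0.0
    return shared, ratio

# Hypothetical exported referring-domain lists for the two sites
site_a = ["moz.com", "bbc.co.uk", "example.org", "news.example"]
site_b = ["moz.com", "bbc.co.uk", "unique-blog.example"]

shared, ratio = link_overlap(site_a, site_b)
# A high ratio suggests little to gain from merging; a low ratio
# suggests the two sites bring diverse link equity together.
```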
| EGOL0 -
Using rel=canonical
Hi Andy, The single page will only serve the same results as the static pages after the user selects the fields in the drop-down; this functionality serves results based on user input only. rel=canonical should be used only in cases of genuine duplication, for instance e-commerce pages that differ only by price or other filters.
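For reference, a canonical tag points a duplicate URL at its preferred version; a minimal hypothetical example for a filtered e-commerce URL (paths are made up):

```html
<!-- Placed in the <head> of /shoes?sort=price, declaring /shoes as the preferred version -->
<link rel="canonical" href="https://www.example.com/shoes" />
```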
| glitterbug0 -
Structured Data versus Data Highlighter Tool
I agree with what Oleg has suggested. You can use http://schema-creator.org/ to create the code. Hope this helps! Umar
| UmarKhan0 -
Cities in Footer
Yeah we've done that for clients due to limited budget but we're working on the idea that I mentioned earlier on our site, which is currently undergoing a redesign, to see how Google reacts. The tricky part is content and avoiding duplicate information but we're setting it up so that the services page contains average but informative content. However each local services page is where we are putting the exceptional content and information with local testimonials and case studies which should make them unique. I haven't seen anyone take this approach so we're hopeful that it will yield positive results. I was curious if you had any experience with it and I really appreciate your feedback on this topic. Thank you!!
| GoogleDowner0 -
Affiliate URL & duplicate content
I'm seeing a lot of that in the SERPs with no particular pattern, even on BBC's site. Are you running Wordpress? Could it be a plugin you've added?
| BradsDeals0 -
No PA, no DA, and no backlinks.
Hi there Craig makes a great point here - always have second sets of data to reference when it comes to backlinks and site performance. Crawlers from one vendor can only do so much, and other vendors sometimes pick up what others are not. Majestic and Ahrefs are great, but also pay attention to Google Search Console, as those are the links that Google is seeing. If you want to give yourself an extra advantage, check out Kerboo - it scores your backlinks and also has a ton of great features to help you assess and maintain your profile. Hope this helps! Good luck!
| PatrickDelehanty0 -
Google Crawling Issues! How Can I Get Google to Crawl My Website Regularly?
Hi Farhan Yes, we've definitely found this helps bring the crawlers back to the site more regularly! Hope it helps Dan
| libero_net1 -
How to know how many pages are indexed on Google?
Honestly speaking, I haven't tried Scrapebox for this purpose, but I've heard the results are not very accurate. I think you should go with URL Profiler if you're going to need it on a regular basis. Let's see what others suggest. Umar
| UmarKhan0 -
Solving duplicate content with WP authors, tags, categories
Thanks Sarah! An alternative resource might be my article on Moz about optimizing WordPress for SEO https://moz.com/blog/setup-wordpress-for-seo-success There are also some prior Q&A threads that address duplicate content in WordPress: https://moz.com/community/q/wordpress-category-archives-index-but-will-this-cause-duplication https://moz.com/community/q/utilising-wordpress-attachment-pages-without-getting-duplicate-content-wwarnings https://moz.com/community/q/moz-analytics-telling-me-i-have-duplicate-content-issues-how-to-fix-this Hope these help!
| evolvingSEO0 -
Fixing Mis-used Rel-Canonicals
You bet! If you don't mind, please mark this as a good answer then. Cheers!
| CleverPhD0 -
Google not using my meta description nor my meta title
Sorry if I missed this, but what search led to this result? Keep in mind that different queries will return different titles and snippets in SERPs. Google is taking more liberties than ever. It does seem like they're appending the brand "- Kbc", which is an increasingly common practice. Usually, it's one of two things: (1) They don't feel that the title/description are a good match to the particular search you're looking at. You could try to get other key phrases in the title/description, but often that ends up being overkill. It depends on how critical these searches are. (2) They feel that the title/description are too long or too marketing-heavy. This can be tougher to pin down. I'll be honest - this is happening more frequently, and there's no way currently to stop Google from doing rewrites. You can try to tweak things to be more to their liking, but you can't shut off the rewrites. If these aren't critical queries, there comes a point where it's effort poorly spent, IMO.
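For context, these are the tags Google is choosing to rewrite; a hypothetical example of page source with the brand appended the way Google's rewrite does it (title and description values are made up):

```html
<title>Current Accounts | KBC</title>
<meta name="description" content="Compare our current accounts and open one online in minutes.">
```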
| Dr-Pete0 -
Panda and Large Web Presence
Panda updates have hit microsites where content across the sites was either duplicated or "thin", although thin is often in the eye of the beholder. Keep in mind, and I mean this kindly, that "unique" is not always high-quality, and the quest for technical uniqueness can lead to practices where microsites are just spinning out versions of content with slightly different keyword concepts or ordering, etc. In other words, it's technically "unique", but most people wouldn't view it as valuable. Early Panda updates did hit certain kinds of spun-off content hard, including geo-located content. In other words, if you spun out your plumbing services page for 5,000 cities and it only differed by city names and a few basic facts (even if technically unique), that's definitely something Panda came down hard on. Truthfully, though, it's really tough to tell without specifics. I'm more on EGOL's side of the fence - my gut feeling is that 20 micro-sites is excessive and I'd strongly suspect quality issues. Some questions that might help you pin things down: (1) Has traffic dropped across the entire cluster of sites or just the main site? (2) Can you pin traffic drops down to any given date, set of keywords, or pages? Drill down as far as you can - that's always the most important first step, IMO. (3) Are some of your micro-sites essentially dead - no traffic or ROI? You might not have to go all-or-none here. Odds are that some small % of your micro-sites are creating a large % of your value (let's call it an 80/20 rule). It's likely you could kill 10-15 of them with very little harm - at least that's what I typically see. You don't have to drop all 20 cold-turkey.
| Dr-Pete0 -
Google still listing pages from old domain after 2 change requests
"Maybe I should try re-establishing robots.txt and a sitemap and seeing if Google recrawls the old domain and picks up the 301s to the new one." If you can do this, then it is definitely worth doing. -Andy
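A minimal robots.txt for the old domain that allows crawling and points to a sitemap would look like this (the domain is a placeholder):

```
User-agent: *
Allow: /

Sitemap: https://old-domain.example/sitemap.xml
```

With this in place and the 301s live, Googlebot can recrawl the old URLs and follow the redirects to the new domain.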
| Andy.Drinkwater0 -
Internal duplicated content on articles, when is too much?
No probs Patrick. Sometimes it's unavoidable and would depend a lot more on the overall quality / trust of the site in general. When you say devalued, in this case the duplicate content would likely just be ignored - is this what you mean? That's not to say that everyone should go out and duplicate chunks of content - continue to focus heavily on making the content as good as it can possibly be. -Andy
| Andy.Drinkwater0 -
Sudden jump in the number of 302 redirects on my Squarespace Site
It does make sense Stephen, but this bit bothers me: "The URL mappings must be used to redirect any link to a deleted page to an existent page." If I am reading that correctly, are they saying that 404's can't exist, just 302's? You don't always want to direct someone to a page, especially if it isn't applicable. In many cases, a 404 would be the right thing to do if a page is just being removed. Of course, if there is somewhere to redirect someone to, then you would do so with a 301 (permanent redirect) rather than a 302 (temporary redirect). I don't really understand their system, so might be missing something in translation. -Andy
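In Apache terms, the distinction here looks like this (paths are hypothetical); mod_alias can issue a permanent redirect when there's a replacement, or an explicit "gone" response rather than redirecting everything:

```apache
# Page moved and has a clear replacement: permanent redirect
Redirect 301 /old-guide /new-guide

# Page removed with no equivalent: tell crawlers it's gone (HTTP 410)
Redirect gone /discontinued-product
```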
| Andy.Drinkwater0 -
To include / at the end of a URL or not
Hey E, Yes, Google considers these two URLs separately. Usually the trailing "/" indicates a directory, while a URL without it denotes a file. It's better to put a 301 redirect to the appropriate page to avoid any problems. Check out what Google recommends on this topic: http://googlewebmastercentral.blogspot.com/2010/04/to-slash-or-not-to-slash.html Hope this helps! Umar
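A common way to enforce one version is a mod_rewrite rule in .htaccess; a sketch that 301s non-slash URLs to the trailing-slash form, assuming the slash version is the one you want canonical on your site:

```apache
RewriteEngine On
# Skip real files (e.g. /style.css) so they keep working
RewriteCond %{REQUEST_FILENAME} !-f
# Append the trailing slash with a permanent redirect
RewriteRule ^(.*[^/])$ /$1/ [R=301,L]
```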
| UmarKhan0 -
.htaccess 301 redirect with special characters
You should just need to put quotes around the URI: Redirect 301 "/stages-points/">http://www.legipermis.com/stages-points/</a></p>" http://www.legipermis.com/stages-points/ This looks like a badly formed link on the site, though, and I'd try to find that and clean it up.
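The general pattern, with a hypothetical path: once the URI is quoted, characters like spaces or stray markup no longer break the directive's argument parsing:

```apache
# Quotes let mod_alias treat the whole malformed URI as one argument
Redirect 301 "/old page/" http://www.example.com/new-page/
```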
| TheeDigital0