Category: Technical SEO Issues
Discuss site health, structure, and other technical SEO issues.
-
Search Operator Link: How accurate is the data?
How often do you guys use the search operator "link:"? Not very often. How accurate is the data? It's accurate, but it only shows a small percentage of the links. Why is the number of links way lower when we use the operator than the number in Google Webmaster Tools or Open Site Explorer? It is lower because in the past people used this search operator to find links pointing to competitors to help with their own link building plans. Does it only show the most powerful links? No, it shows a random selection most of the time.
| gazzerman10 -
Duplicate Page due to category and tags - Wordpress Website Issue
This is a common problem in WordPress. Tag archives, category pages, and author archives can create many copies of the same content and cause numerous duplicate content issues. Most people choose to noindex their archives since they have little search value, though they can still have value from a user perspective. Setting posts to show snippets on archive pages can lessen some duplication issues as well. You should also double-check the tags you are using, as one-off tags can cause a lot of duplication problems: if a post has 7 tags and those 7 tags are only used on that one post, you now have multiple versions of that post sitting on your site. Make sure to only use relevant tags, and that those tags are also used on future posts as needed in order to tie together related posts. All the functionality to noindex archives and switch posts to snippets can be found either in your current theme or in the Yoast plugin, which is a good download to help manage those things.
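For reference, the noindex directive the theme or Yoast plugin would add to archive templates is a single line; this is a minimal sketch of what ends up in the page head (not the plugin's exact output):

```html
<!-- In the <head> of tag/category/author archive templates -->
<!-- "noindex, follow" keeps the archive out of the index but
     still lets crawlers follow its links to the posts -->
<meta name="robots" content="noindex, follow">
```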
| MikeRoberts0 -
Detail page popup questions for real estate client
Your developer is (partially) correct :). You see, that DOES affect SEO, but in a good way. Over time, Google has learned to recognize what's best for the user, and from my personal point of view, a "lightbox" (that's what that "window" is called) is far better than opening a new page if it lets you present the property details better. Whatever looks better for the user will also look better to Google. Google is also capable of running and understanding JavaScript, so you shouldn't have any problem, especially since the link actually points to the page, meaning search engines unable to run JavaScript can still crawl the site perfectly. Hope that helps!
| FedeEinhorn0 -
What is the best way to change tons of 302s to 301s...
Thanks, I'll try to do that. The mess is worse than I thought...
| Felip30 -
What is the best way to handle these duplicate page content errors?
You can just add a canonical URL tag: http://moz.com/blog/canonical-url-tag-the-most-important-advancement-in-seo-practices-since-sitemaps
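The tag itself is a single line in the head of each duplicate page, pointing at the preferred URL (the URL below is a placeholder):

```html
<!-- On every duplicate/variant page, point at the preferred version -->
<link rel="canonical" href="http://www.example.com/preferred-page/">
```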
| Felip30 -
Why did my website's main page suddenly disappear from Google search?
Make sure you disavow the right links. If it's done wrong, your website may never recover. Another good thing you can do is earn fresh links from new sources. Depending on how much time you've got, you could recover.
| Stewart_SEO0 -
Redirects & 404 Errors
Awesome Dan, thanks again for helping me out! I implemented what you suggested and I'm still having the same issue. I think it has to do with the way the old URLs were set up, which included "index.php" before every folder (i.e. http://orchardsinn.com/index.php/specials) When I implement the redirect: Redirect 301 /specials/ http://www.orchardsinn.com/special-offers/ Then clear my cache and attempt to click the old link, it takes me to the site with a 404 error page. Then, when I try to navigate to another page via the navigation, they all return 404 errors. Any thoughts?
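One hedged guess, given the old index.php-prefixed URLs: mod_alias's `Redirect` matches path prefixes literally, so `/specials/` never matches `/index.php/specials`. A mod_rewrite rule that strips the `index.php/` segment might look like this (an untested sketch, to be adapted to your setup):

```apache
RewriteEngine On
# Match the old URL /index.php/specials and 301 it to the new flat URL
RewriteRule ^index\.php/specials/?$ http://www.orchardsinn.com/special-offers/ [R=301,L]
# Generic fallback: strip index.php/ from any remaining old URL,
# preserving the rest of the path
RewriteRule ^index\.php/(.*)$ http://www.orchardsinn.com/$1 [R=301,L]
```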
| marisolmarketing0 -
Linking shallow sites to flagship sites
Your first link didn't work. Links from these shallow sites probably won't pass much link juice (low authority, presumably), but if they're getting traffic then you might want those visitors to see your flagship brand, where you have the bulk of your content. In that case it would benefit the visitor and therefore be worth doing. As always, be careful with linking to your flagship site if the shallow sites are linked to by spammy sites. It seems to already be doing pretty well, so taking chances for minimal benefit probably wouldn't be worth it.
| Harbor_Compliance0 -
Does the order of results from "site:www.example.com" tell us anything?
On every result in the first 6 pages, the domain name is in either the meta title or meta description (bar the very first result, which is the homepage). The landing pages do not have the domain name in them, so Matt's point that "site:domain.com" is really "site:domain.com domain.com" makes a lot of sense. Also, a good few of my landing pages are ranking, while 95% of the search pages (that are on the first 6 pages) are not. The 5% that do rank were once linked to from the main site; those links have been removed, but Google is still stubbornly ranking them over the proper landing pages even though they don't have any backlinks anymore.
| PaddyDisplays0 -
Are backlinks the reason for my site's much lower SERP ranking, despite similar content?
Thanks for both of your responses! @IrvCo_Interactive I do have Webmaster Tools and I do not have any warnings, but I do see 70 errors under Search Appearance > Structured Data, all regarding Missing: author and/or Missing: updated. I will look into what that's about. I am working to fix the weird subdomains coming up in Google for our site. We never used these domains, and someone suggested that this might be the result of a wildcard A record in my DNS. This other Moz article suggests using .htaccess to 301 these, so that might be the fix I use: http://moz.com/blog/find-your-sites-biggest-technical-flaws-in-60-minutes I am not quite sure what I need to do to make sure my site's internal structure is solid; I just set it up in ways that seemed intuitive from a user perspective. I'm sure there are articles out there on this subject though. @Moosa Hemani Indeed, that site http://www.hvhtek.com/associates/surfopt/surfopt.html does use a few paragraphs from our site. What tool did you use to identify that this duplication was out there? I have done almost no work on link building and neither has anyone else, so in a sense what we have now is our natural link profile. But our site has been around for a long time, so we may have gathered some bad links anyway.
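A sketch of the .htaccess approach described in that article, redirecting any unexpected subdomain (as created by a wildcard A record) back to the canonical host; www.example.com is a placeholder for your own domain, and this is untested:

```apache
RewriteEngine On
# Any request whose Host header is not the canonical www host
# (e.g. a stray wildcard subdomain) gets a 301 to the main site,
# preserving the requested path
RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```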
| erin_soc0 -
My site is not being regularly crawled?
Google may have reduced their crawl budget for your site if they found enough pages to be low quality, thin, duplicate, etc. Here are some examples that you can probably apply a noindex tag to in order to reduce crawl budget waste. Google these:

site:howlatthemoon.com/ inurl:tag — 1,360 indexed pages that don't need to be in the SERPs.

site:howlatthemoon.com/ inurl:tag page — even if you want the tag pages indexed, you don't need their paginated pages indexed too.

site:howlatthemoon.com/ inurl:page inurl:category — a few dozen category pagination pages in the SERPs, and many more on the site that have been booted from the index. Example: http://www.howlatthemoon.com/dueling_piano_bar/category/nightlife/page/2/

I would install the Yoast WordPress SEO plugin, which should fix these in the following ways: use rel canonical tags, use rel next/prev tags, noindex tag pages, etc.
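For illustration, the head of a paginated tag archive handled this way might carry tags like the following (a sketch only; the tag path is hypothetical and Yoast's exact output may differ):

```html
<!-- Hypothetical paginated tag archive, e.g. /tag/nightlife/page/2/ -->
<!-- Keep it out of the index while still letting crawlers follow links -->
<meta name="robots" content="noindex, follow">
<!-- Tie the pagination series together for crawlers -->
<link rel="prev" href="http://www.howlatthemoon.com/tag/nightlife/">
<link rel="next" href="http://www.howlatthemoon.com/tag/nightlife/page/3/">
```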
| Everett0 -
Removed .html - Now Get Duplicate Content
Handle this with care; I'm not responsible for breaking anything on your site. RewriteRule ^(.*)\.html$ http://new.site.com/$1 [R=301,NC,L]
| Martijn_Scheijbeler0 -
Where are Schema, Twitter card and OpenGraph code needed?
Hi Adrian, You want OG tags and Twitter card tags to be specific to each page on your site. For instance, pagea.html's tags should relate to that page's content and pageb.html's tags should contain information related to pageb.html's content. On Schema, that is going to be specific to the content...so if you've got a review on your home page and the same review is on a second page, then the same Schema markup might be contained on the home page and on the second page. Hope that helps. Let me know if you need more clarification. Thanks, Matthew
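A minimal per-page sketch of the OG and Twitter card tags described above (all values are placeholders for your real page content):

```html
<!-- Open Graph tags: unique to each page's own content -->
<meta property="og:title" content="Page A Title">
<meta property="og:description" content="Summary of Page A's content.">
<meta property="og:url" content="http://www.example.com/pagea.html">
<meta property="og:image" content="http://www.example.com/images/pagea.jpg">
<!-- Twitter card tags: also page-specific -->
<meta name="twitter:card" content="summary">
<meta name="twitter:title" content="Page A Title">
<meta name="twitter:description" content="Summary of Page A's content.">
```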
| Matthew_Edgar0 -
How to keep old URL Juice During Site Switch
Hi Michael, Thanks man that means a lot! All the best, Thomas
| BlueprintMarketing2 -
What should I do with URLs that cause site map errors?
Hi Christy, Unfortunately not!
| Ideas-Money-Art0 -
Any ideas why this site is being penalized?
Hi Jim- There are plenty of good articles about Penguin recovery, but here is the main goal: remove unnatural backlinks. Look for non-relevant backlinks, backlinks with manipulative anchor text or an unnaturally high anchor text ratio, sitewide links, etc. This can vary with each site, so please don't look at my blurb and just start getting rid of backlinks. There has been a ton of discussion on Moz about Penguin recovery, which you should read up on: http://www.google.com/#q=penguin+recovery+moz.com Do a lot of research if you are going to tackle it yourself instead of hiring it out. Best of luck!
| anthonydnelson0 -
Do other search engines use meta keywords?
Yes, there are a lot of search engines out there, and some of the older ones still look at meta keywords. Some SEOs say to use them and some say it's a waste; I find they can't hurt, and the keyword research behind them helps you know what to target in your copy anyway. So if you've done the research and have keywords to implement, why not add them to the meta tags? But no, Google, Bing and Yahoo don't use them. Joel Newport
| CWPSEO0