Category: Intermediate & Advanced SEO
Looking to level up your SEO techniques? Chat through more advanced approaches.
Can anyone see any issues with the canonical tags on this web site?
If the canonical link is not set, it's possible you will get additional pages indexed for the same content, especially if you are using a CMS. The best course of action would be to do a site: search for your domain and see how many pages are indexed. If you know you have 200 pages on your site and you have 2,000 indexed, you most likely have some duplicates showing up. We wrote an article about avoiding duplicates through .htaccess earlier this month. It might be worth a read if you have not set any of those rules for your site. CMSs (WordPress, Joomla) can be tricky to get duplicate pages under control.

*edit: Looking at your indexed pages, you are showing 122,000 URLs indexed for your site. If that is far more than the number of pages you actually have, you should look into assigning canonical links for all your posts and, for the duplicates already indexed, setting up redirects. You can also use Google's URL parameter tool to help, although that tool should be used with caution if you don't know how to use it.
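For reference, the canonical link itself is a single line in the <head> of every duplicate variant of a page, pointing at the version you want indexed. A minimal sketch (the domain and path here are placeholders, not your actual site):

    <!-- Placed in the <head> of /post?sort=date, /post/print, and any other variant -->
    <link rel="canonical" href="http://www.example.com/post/" />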
| David-Kley
Questions Regarding WordPress Blog Format, Categories and Tag Pages...
Hi There! To answer your first question about URL structure, I would see slides 12-15 in this Mozinar I did: http://www.slideshare.net/evolvingseo/hands-onwpseodanshure (you can also watch the whole Mozinar here). In short: if your site is mainly a blog, it's OK to do site.com/blog-post, but if your blog is a section within a larger site you probably want to do site.com/blog/post-name. You will want to 301 redirect the old URLs to the new ones. I think it's worth it.

For many answers about tags etc. you can see my post about WordPress SEO here, but in general:

categories - index
tags - noindex
author archives - noindex for a single-author blog, index for multi-author blogs with customized author pages
dated archives - noindex
subpages of archives - noindex

You're not noindexing to avoid duplicate content; you're noindexing to avoid having too many pages indexed that don't need to be, and probably won't rank anyway.
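If you're not using an SEO plugin to apply those rules, each "noindex" above comes down to a robots meta tag in the <head> of the relevant archive pages. A minimal sketch of that output (most WordPress SEO plugins can generate it for you):

    <!-- Output on tag archives, dated archives, and archive subpages -->
    <meta name="robots" content="noindex,follow" />

The noindex,follow combination keeps the page out of the index while still letting crawlers follow the links on it.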
| evolvingSEO
Duplicated privacy policy pages
Agreed. This won't result in a penalty. To add: since most privacy policies are linked from the footer, you might want to add a rel="nofollow" to the link.
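For illustration, it's a one-attribute change on the existing footer link (the URL here is just a placeholder):

    <a href="/privacy-policy" rel="nofollow">Privacy Policy</a>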
| seowoody
Does Unique Content Need to be Located Higher on my webpages?
Follow backlinks, site architecture, and quality of content way above the competition. I see businesses buying up 100+ keyword-rich domains and ranking well for all of them. It tells me two things: 1) search engines are not always that clever, and 2) I need to be patient, because of 1).
| khi5
Schema Markup for Magento
Hello Rachel, I had a look at each of these and probably wouldn't recommend MSemantic, based on the user reviews. Google Rich Snippets for Magento only had 2 reviews, which is too few for my taste. If you want to go with an extension, I would start with either of these two, one of which you've already mentioned:

http://www.magentocommerce.com/magento-connect/rich-snippets-suite.html
http://www.magentocommerce.com/magento-connect/seo-rich-snippets-google-bing-yahoo-schema-org.html

You could also take the DIY approach:

http://www.creare.co.uk/magento-product-schema
http://gotgroove.com/ecommerce-blog/developers-toolbox-adding-rich-snippets-to-magento-products-with-schema-org-tags/
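If you do take the DIY approach, the end goal is schema.org Product markup in your product view template. A hedged sketch of the sort of microdata those tutorials produce; the name, price, and currency here are placeholders you would fill in from Magento's template variables:

    <div itemscope itemtype="http://schema.org/Product">
      <span itemprop="name">Example Product</span>
      <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
        <meta itemprop="priceCurrency" content="USD" />
        <span itemprop="price">29.99</span>
      </div>
    </div>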
| Everett
Recovering from a Redesign?
I'm getting something a little different out of this. I plugged rugcare.com into SEMrush. It shows a considerable spike in traffic in June, then a very big drop in July. Joehadeed.com and bergmanns.com are showing relative spikes and drops as well, so there's a possibility that a bigger competitor is eating everyone's share.

When I plug rugcare.com into Majestic SEO, though, it shows 55K redirect links pointing to rugcare.com. Majestic picked this up in late June, which may account for the sudden spike in rankings and traffic, followed by a steep decline in July. It also appears that the site received a number of links from housecleaningadvice.com with the anchor text 'upholstery cleaning tips - rug care'. They appeared to be nofollow, and I can't find the links anymore, but I wouldn't take a link from that site at all. It looks like it's part of a typical blog spam network. I don't want to scare you, but have you received any warnings in GWT?

I also can't find these three pages on the current site via nav or crawl:

https://web.archive.org/web/20131230215654/http://www.rugcare.com/carpet-cleaning
https://web.archive.org/web/20131014021917/http://www.rugcare.com/oriental-rug-cleaning
https://web.archive.org/web/20131014021812/http://www.rugcare.com/carpet-and-rug-repairs

Was there any particular reason for their omission? Do you know how much traffic those pages used to receive? It looks like the oriental rug cleaning page would have accounted for a chunk of traffic on its own. Now it just 404s.
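If the omission wasn't deliberate, those old URLs should at least be 301-redirected to the closest surviving pages rather than left to 404. A rough .htaccess sketch, assuming an Apache server; the destination paths are placeholders for whatever actually replaced those pages:

    # mod_alias 301s from the old URLs to their nearest current equivalents
    Redirect 301 /carpet-cleaning /services/carpet-cleaning
    Redirect 301 /oriental-rug-cleaning /services/oriental-rug-cleaning
    Redirect 301 /carpet-and-rug-repairs /services/repairs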
| Travis_Bailey
Using two 404 NOT FOUND pages
Hi, thanks for your reply. Yes, the HTTP response code will definitely be 404. I just was not sure whether serving two separate custom 404 pages was all right. Cheers
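For what it's worth, this is straightforward on Apache: ErrorDocument can be set per directory, so each section of the site can serve its own custom 404 page while both still return a true 404 status. A sketch, assuming your host allows .htaccess overrides and using placeholder page paths:

    # In the root .htaccess - default 404 page for the whole site
    ErrorDocument 404 /not-found.html

    # In /blog/.htaccess - overrides the root rule for the blog section
    ErrorDocument 404 /blog/not-found.html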
| RonFav
Sandboxed?
Thomas, could you review your response and edit with a keyboard? Some of the formatting is off, making it a bit confusing to understand. Thanks!
| KeriMorgret
Proper Form for Title & Description Tags
The title and description look good to me. If I were going to modify it myself, I would add some of this - "Flatiron loft for rent | West 21st Street" - to the description tag, and also add more of a location to the title and description, such as "NYC" or "New York City", ya know?
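To make that concrete, here's a rough sketch of how those tags might end up; the exact copy is just an illustration, not final wording:

    <title>Flatiron Loft for Rent | West 21st Street, NYC</title>
    <meta name="description" content="Flatiron loft for rent on West 21st Street in New York City." />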
| benjaminmarcinc
Better to use specific cities or counties for SEO geographics?
This is unscientific and based on the way I personally search, but I'd ask how people in the area refer to that area for some additional input. Here are some examples of the ways I have searched for locations in areas I've lived in (or had family live in).

In the SF Bay Area, I used to live in Newark. There were about 30,000 people in the city, and it was surrounded by the SF Bay on one side and Fremont on the other three sides. I would verbally tell people in the region that I lived in Fremont, and I'd search for local businesses using Fremont instead of Newark, as otherwise I'd get results for New Jersey.

I have relatives that used to live in Woodstock, VA. Everyone always thinks of Woodstock, NY, and it's hard to find local info, especially when searching from the West Coast. A lot of businesses describe themselves as in the Shenandoah Valley (and it was Shenandoah County), so I'd often search for Shenandoah, or Front Royal, which was the nearest sizable town.

Other relatives live in Battle Creek, Iowa, a town of 800 or so people. Even with adding Iowa, I get way too many results for Battle Creek, Michigan. If I need to search for something (usually on eBay, looking for memorabilia) I will search for Ida Grove or Ida County.

I know this really isn't an answer to your question, but more of some things to think about. Again, I'd ask (if you're not local to the area yourself) how people usually describe where they live, and look at search volume for that. Maybe also run some AdWords ads targeted to desired zip codes, and then look in the Search Query Reports in AdWords to see what cities people type in to modify their search?
| KeriMorgret
Site Speed, is it worth it from an SEO point of view?
One easy way that I've found to help with page speed is Cloudflare's free plan. It does a lot to speed up your site, such as auto-minifying CSS, HTML, and JavaScript. It also caches your pages and greatly reduces server response times, and it offers security features as an added benefit.
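Separately from Cloudflare, if the site runs on Apache you can also set browser caching headers for static assets yourself. A minimal sketch using mod_expires, assuming the module is enabled on your server (the cache lifetimes are just examples):

    <IfModule mod_expires.c>
      # Tell browsers to cache static assets instead of re-requesting them
      ExpiresActive On
      ExpiresByType text/css "access plus 1 month"
      ExpiresByType application/javascript "access plus 1 month"
      ExpiresByType image/png "access plus 6 months"
    </IfModule>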
| spencerhjustice
Client is paranoid about a Google Penguin penalty from getting links from a new website they are building
Excellent response Alan! Thanks
| VanguardCommunications
How does having multiple pages on similar topics affect SEO?
Makes a lot of sense, thank you.
| JonathonOhayon
Breadcrumbs for E-Commerce Site
This is a topic that Google is very familiar with. Matt Cutts did a video specifically about ecommerce websites and multiple breadcrumb paths for the same product. That video is here: http://www.seroundtable.com/google-breadcrumbs-seo-18285.html
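As a related note, whichever single breadcrumb trail you expose can also be marked up with schema.org so Google can show it in the search snippet. A hedged sketch using BreadcrumbList in JSON-LD; the category names and URLs are placeholders:

    <script type="application/ld+json">
    {
      "@context": "http://schema.org",
      "@type": "BreadcrumbList",
      "itemListElement": [
        { "@type": "ListItem", "position": 1, "name": "Rugs",
          "item": "http://www.example.com/rugs/" },
        { "@type": "ListItem", "position": 2, "name": "Oriental Rugs",
          "item": "http://www.example.com/rugs/oriental/" }
      ]
    }
    </script>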
| Ray-pp
Best Approach to Redirect One Domain to Another
Hi Rich,

Elaborating on FedeEinhorn's answer: if the page structure is the same, you could just redirect all requests to the same URI on your new domain, as he stated. You can do this very easily with a .htaccess file in the root folder of your old domain (providing you're running an Apache webserver, like most people). To redirect using regular expressions and capture groups, we can use the RedirectMatch directive, which would look like this:

    RedirectMatch 301 ^(.*)$ http://www.newsite.com$1

As simple as that, you've redirected all existing pages to the same page on the new domain. If you haven't used this before, here's a brief look at how it works. Firstly, RedirectMatch simply tells Apache we're using the RedirectMatch directive from mod_alias. 301 specifies a 301, SEO-friendly redirect, which passes all that lovely SEO juice to your new site. ^(.*)$ is our regular expression: the ^ marks the start of the requested URI, the brackets form a capture group around what we want to capture, the . matches any character or symbol, the * means zero or more of the preceding character (which leads to everything being caught by our capture group), and the $ marks the end of the requested URI. The final part of the redirect specifies the page to redirect to; since we captured the request in the previous part, we use $1 to append our first (and only) capture to the end of the new domain.

If you have completely changed your site, you may wish to redirect all requests to your homepage or another page. That is as easy as modifying the previous code so it doesn't append our capture to the end of the redirection target, so this would be acceptable:

    RedirectMatch 301 ^(.*)$ http://www.newsite.com

But since we don't need anything from the requested URI, we should really remove the brackets (the capture group) for the sake of tidiness, resulting in:

    RedirectMatch 301 ^.*$ http://www.newsite.com

You could also use a mixture of these two. For instance, if your blog posts are all identical but your main site pages have all changed, this code would redirect all pages starting with /blog/ to their double on the new domain, but redirect all other pages to a /we-have-moved/ landing page:

    RedirectMatch 301 ^(/blog/.*)$ http://www.newsite.com$1
    RedirectMatch 301 ^.*$ http://www.newsite.com/we-have-moved/

Hope that's useful,

Tom
| TomVolpe
Change MediaWiki URLs to - instead of _
Hi Noah, I suggest you stop by Stack Overflow (http://stackoverflow.com/), where development questions like this tend to get better answers.
| FedeEinhorn
Internal links question
There were two initial reasons for the 100-link "rule":

1. Google's crawl budget was limited.
2. Each page has a given amount of authority it is able to pass on to internal pages. Fewer links mean more authority for each link.

The first part is out the door now. Google has plenty of crawl budget for most sites, and will give plenty if your huge website has enough authority. So the problem you need to evaluate is whether you are spreading your authority too thinly across all of these pages. Would you be better off limiting it to certain categories, or are you going to be a rock star at link building and don't need to care?

After you consider that, you will want to consider your users. Are you giving them too many options and causing choice overload? Test different options and see what performs better. There is a TED Talk about choice overload that used jelly as an example. One test had over 20 jellies to choose from, and one had around 6. At the end of the day, people were more likely to actually buy the jelly when there were only 6 options. What the study determined was that having too many choices overwhelmed shoppers and led to fewer sales. Moral of the story: don't give people too much jelly.
| WhoWuddaThunk
Difference in Number of URLs in "Crawl, Sitemaps" & "Index Status" in Webmaster Tools, NORMAL?
Hi Niners: I have run a Xenu link report and there are no broken links or anything out of the ordinary. Could having additional pages indexed by Google devalue a site in Google's eyes? The additional pages coincide with a decline in ranking. Thanks, Alan
| Kingalan1