Just a side note Jake, if it gets really bad for any pages and you don't feel that's an option you can always use the:
Hope this helps
I have to say I agree with Sha on this one.
If you are not confident in using .htaccess then I wouldn't bother. I think there is a much easier solution:
1- As Sha said, use webmaster tools to tell Google how to handle these parameters, this should slowly start to take them out of the index.
2- Add rel=canonical to all your pages, this way even if parameters are added, the rel=canonical will always point back to the original and remove any risk of duplicate content.
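To illustrate (the URL here is just a hypothetical example), every parameter variation of a page would carry the same canonical tag in its <head>, pointing back at the clean URL:

```html
<!-- Served on /shoes, /shoes?sort=price and /shoes?page=2 alike -->
<link rel="canonical" href="http://www.example.com/shoes" />
```

That way search engines treat all the parameter versions as one page.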
I hope this helps.
Craig
Hi,
This looks like a copy-paste error to me. If you compare the source code of your site with that of another site whose canonical you know works, you'll see that the quotes around the canonical and the URL are blue, when they should be a different colour. If you look at the image I've shared, you'll see the other links in the source code have differently coloured quotation marks.
My guess is that you pasted this from a rich-formatting document like MS Word, or that the quotes are the wrong type. To fix it, open a plain text editor like the basic Notepad, write the tag out again, then add it to the site without formatting. That should fix it.
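To show what I mean (the URL is a hypothetical example), compare the curly quotes a word processor pastes in with the straight quotes a plain text editor produces:

```html
<!-- Broken: curly “smart” quotes from a word processor -->
<link rel=“canonical” href=“http://www.example.com/page/” />

<!-- Working: straight quotes from a plain text editor -->
<link rel="canonical" href="http://www.example.com/page/" />
```

Browsers and search engines don't treat the curly characters as quote marks, so the broken version is effectively ignored.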
If you want to double check the HTML, I recommend downloading a free text editor like Visual Studio Code which makes it a bit easier to see the formatting is correct as you're writing HTML. See the attached image as an example.
Let me know if this solves your problem.
Hi Isaak,
Have you done anything to the site recently? That's the most obvious place to start. I checked the backlinks to the site and noticed there are a lot of other domains redirecting to it, for example:
http://www.woodfurniture.co.uk/
http://www.thefurnitureemporium.com/
http://www.lightoakfurniture.co.uk/
http://www.mattresssolutions.co.uk/
http://www.darkoakfurniture.co.uk/
http://www.bedroomfurnituresolutions.co.uk/
http://www.bedroom-oak-furniture.co.uk/
http://www.tresoakfurniture.co.uk/
http://www.thefurnitureemporium.co.uk/
Were any of these (or other sites) recently redirected?
Craig
Great! I'll mark this as resolved then.
Craig
Hi,
I think it depends on the site, but my initial thought is that there probably isn't a lot of value in having those pages at all. Imagine the site grew to 50X its current size: would it still make sense to have those pages? By that point, the site would have more thin or noindexed pages than useful ones.
I think it's important to think from a user point of view. Just because you add noindex doesn't mean Google will take the page out of the index in a hurry. That means there is a lot of potential for people to still find you in Google, arrive on an expired ad, have a bad experience and leave. What about trying to do something a bit more useful like redirecting them to a closely matched product category above? Even doing that might be confusing to users unless you have an overlay explaining that the product is no longer available, so you are redirecting them to a different page.
Craig
Hi June,
Can you share the URLs so I can take a look? I won't be able to help other than just link to the standard hreflang guidelines unless I can look at the specific site.
Craig
Hey Adam,
I've never seen any conclusive evidence as to what the order is. I've heard some people say it could be the order in which the pages were first crawled, their order of importance, etc., but to be honest I wouldn't worry about it, or use it as your basis for which pages Google thinks are most important.
Spend time creating a good, clear navigational structure when the site is built and submit a sitemap; between the two you should be able to let Google and other search engines know exactly which pages are the most important.
If the first page isn't your home page, first make sure Google isn't showing you personalised results; a good way to check is to open an 'incognito' window in Chrome and run the same search. Whatever the result, I wouldn't worry unless the home page isn't indexed at all.
Hi Even, this is quite a common problem. There are a couple of things to consider when deciding if Noindex is the solution rather than robots.txt.
Unless there is a reason the pages need to be crawled (like there are pages on the site that are only linked to from those pages) I would use robots.txt. Noindex doesn't stop search engines crawling those pages, only from putting them in the index. So in theory, search engines could spend all their time crawling pages that you don't want in the index.
Here's what I'd do:
Decide on a reasonable number of facets, for example, if you're selling TVs people might search for:
But past 3 facets tends to get very little search volume (do keyword research for your own market)
In this case I'd create a rule that appends something to the URL after 3 facets that would make it easy to block in robots.txt. For example, I might make my structure:
But as soon as I add a 4th facet, for example 'colour', I add in the /filter/ subfolder.
I can then easily block all these pages in robots.txt using:
Disallow: /filter/
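Sketching that out with hypothetical TV URLs (the folder names are just examples, not from a real site):

```
# Up to 3 facets stay crawlable:
#   /tvs/samsung/
#   /tvs/samsung/40-inch/
#   /tvs/samsung/40-inch/smart/
# A 4th facet drops the URL into /filter/:
#   /filter/tvs/samsung/40-inch/smart/black/

User-agent: *
Disallow: /filter/
```

One robots.txt line then blocks every deep facet combination in one go.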
I hope this helps.
Hi Prime,
I don't see there being a problem with this at all; as you said, it's helpful for users. However, make sure it also makes sense grammatically: for example, "see all TVs" works, but "see all car insurance" does not. Don't try to shoehorn a keyword in just for the sake of it.
I agree with Karl that breadcrumbs are a good idea both from an SEO point of view and a user point of view.
I hope this helps.
Thanks,
That's quite cool to see, and I take your point. It would be interesting to test by setting up a brand new site, throwing up a few pages and seeing what happens. Common sense says Google should always show the home page, contact page, etc., but as we have seen, that's not always the case.
Have you had a look at what your most popular blog posts have been, to see if that correlates in any way?
Craig
I've had a look at this in a fair amount of depth but I still can't get it working. I think there are a few things at play here:
Sorry I couldn't be more help. I'll add this reply to the Moz page too for the benefit of others.
Although I agree with what some of the others have said, it's not enough to just disavow all the links. Google wants to see that you have actually tried to remove them, so keep a record of the sites you tried to contact as proof. Google wants to make this hard for you; otherwise there is no incentive not to do the same again. Don't be surprised if you need upwards of 5 reconsideration requests.
Some good resources on the process are here:
http://moz.com/blog/ultimate-guide-to-google-penalty-removal
http://moz.com/ugc/the-anatomy-of-a-successful-reconsideration-request
Good luck
Hey,
The general rule is to use the most appropriate level above. For example:
Thing > Place > Landform > BodyOfWater > Waterfall
In the waterfall example, if there wasn't a specific entity for waterfall, you would use the level above, "BodyOfWater".
This isn't ideal, but it's the best you can do without something more specific.
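As a minimal sketch (the place name is made up), marking a waterfall up with the closest available parent type in JSON-LD would look something like:

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "BodyOfWater",
  "name": "Example Falls"
}
</script>
```

If schema.org later adds a more specific type that fits, you'd just swap the @type for it.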
Craig
Hi Landon,
The "site:" operator is probably what you're looking for. To use Bob's example of finding questions about SEO on Yahoo answers, it would be:
"site:uk.answers.yahoo.com SEO"
I hope this helps.
Hi,
I don't see a redirect either. I tried fetching from the UK and the USA and neither redirected me. Are you still having this problem?
Craig
Hi, the higher up the architecture the links point, the more pages the authority has the potential to influence. For example, building links to the homepage will also help any pages it links to. Building links to one deep page will have a larger impact on that particular page but won't benefit others as much. Does this make sense?
I hope this answers your question; unfortunately I don't know of any case studies, since the two approaches can't really be compared directly. I'm still not 100% sure of the question, if I'm honest.
Craig
Hi G,
I wouldn't worry about it from an SEO point of view, changes of that size will have little or no impact. It's strange from a user point of view though to have the main heading of a page below the fold so I might consider doing it anyway.