I always thought that if you change a URL, the backlinks pointing to it either break or get redirected to the new URL, but that either way all of the link juice that page had earned is lost.
Can anyone elaborate on this topic?
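For what it's worth, the standard way to carry most of that link equity over to the new URL is a 301 (permanent) redirect at the server level. A minimal sketch in Apache .htaccess; both paths are hypothetical placeholders:

```apache
# .htaccess (Apache mod_alias) — permanently redirect a moved page
# so backlinks, and most of their link equity, follow it to the new URL.
# /old-page.html and /new-page/ are hypothetical example paths.
Redirect 301 /old-page.html /new-page/
```

A 302 (temporary) redirect is generally reported not to pass equity the same way, which is why 301s are the usual recommendation for permanent URL changes.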
Welcome to the Q&A Forum
Browse the forum for helpful insights and fresh discussions about all things SEO.
I always thought that if you changed a URL, whatever backlinks that linked to that URL are lost (or re-directed to the new link) but that all juice that was given to that page is lost.
Can anyone elaborate on this topic?
I have a website built on osCommerce and I am planning to transition to Magento.
I was told that the move to Magento would change my URL structure.
How do I preserve my current URL structure while migrating to the Magento platform so that I do not lose my backlink profile?
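One common approach is a one-to-one 301 map from the old osCommerce URLs to the new Magento ones (Magento also ships a URL Rewrite management tool that can do this from the admin). A hedged .htaccess sketch, assuming a typical osCommerce query-string URL; the product ID and target path are made up for illustration:

```apache
# Map an old osCommerce product URL to its new Magento URL.
# mod_alias's Redirect can't match query strings, so use mod_rewrite.
# products_id=123 and /widget-blue.html are hypothetical examples.
RewriteEngine On
RewriteCond %{QUERY_STRING} ^products_id=123$
RewriteRule ^product_info\.php$ /widget-blue.html? [R=301,L]
```

Repeat (or script) one rule per product so every backlinked URL lands on its new equivalent rather than a 404.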
Do you need to optimize a 410 page the way you do a 404 page?
What does a visitor see when a page returns a 410 compared to a 404?
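For reference, both are just HTTP status codes; what the visitor sees is whatever error page the server returns alongside them. A minimal Apache sketch, with hypothetical paths, showing a 410 for a deliberately removed page plus custom error pages for each code:

```apache
# .htaccess — mark a deliberately removed page as 410 Gone
# (the path is a hypothetical example).
Redirect gone /discontinued-product.html

# Serve custom error pages (requires these files to exist) so visitors
# see something helpful instead of the server's bare default.
ErrorDocument 404 /errors/not-found.html
ErrorDocument 410 /errors/gone.html
```

Without an ErrorDocument, visitors get a bare server error for either status; the practical difference is mainly for crawlers, since 410 signals the removal is intentional and permanent.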
Can someone respond to the questions on my post? Thanks.
I have a site with paginated search result pages. I've set them to noindex,follow and placed a rel=canonical tag on page 2, page 3, page 4, etc., pointing back to the main/first search result page. These paginated pages aren't visible to the user (since I'm not technically selling products, just providing different images), and I've added a text link at the bottom of the first/main search result page that says "click here to load more"; once clicked, it automatically loads more images onto the page via AJAX. Is this a proper strategy?
Also, for a site that does sell products, would it suffice to simply noindex,follow the paginated search result pages and place a canonical tag on them pointing back to the main search result page?
I would love feedback on whether this is a proper method/strategy to keep Google happy.
Side question: when the robots go through a page that is noindex,follow, do they take into consideration the text on the page, the page title, the meta tags, etc., or do they only look at the links within that page and pass link juice through them?
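To make the setup concrete, here is roughly what the head of a paginated page would contain under the strategy described above (the URL is a hypothetical placeholder):

```html
<!-- Hypothetical page 2 of a paginated search result set. -->
<head>
  <!-- Keep this page out of the index but let crawlers follow its links. -->
  <meta name="robots" content="noindex,follow">
  <!-- Point consolidation signals back to the main result page. -->
  <link rel="canonical" href="https://www.example.com/search/blue-widgets/">
</head>
```

One caution worth hedging on: Google has said it may treat a canonical from a paginated page to page 1 as only a hint and ignore it when the pages aren't near-duplicates, and combining noindex with a canonical sends somewhat mixed signals, so responders may flag that combination.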
Devanur,
What I am asking is whether the robots/Google will view it as a negative thing to noindex pages while still trying to pass link juice through them, even though those pages aren't even viewable to the front-end user.
Yes, but what if these pages aren't even viewable to the front-end user?
If you have a mass of pages with super-thin content (such as pagination pages) and you noindex them, then once they are removed from Google's index (and given that these pages aren't viewable to the user and/or don't get any traffic), is it smart to remove them completely (404?) or is there any valid reason to keep them?
If you noindex them, should you keep all of the URLs in the sitemap so that Google will recrawl them and notice the noindex tag?
If you noindex them and then remove the sitemap, can Google still recrawl the pages and recognize the noindex tag on its own?
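If you do keep the URLs in the sitemap temporarily, each one is just a standard url entry; a minimal sketch with a hypothetical URL:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Hypothetical noindexed page, left in the sitemap temporarily
       so Google recrawls it and sees the noindex tag sooner. -->
  <url>
    <loc>https://www.example.com/search/blue-widgets/page/2/</loc>
  </url>
</urlset>
```

Either way, Google can still recrawl URLs it already knows about without a sitemap; listing them just tends to speed the recrawl up.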
Hello,
I know Google says you are supposed to make anchor-text links nofollow in press releases, but what about just putting the site URL itself (example.com) and making it dofollow?
Is that okay?
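For clarity, nofollow is an attribute on the link itself, not on the anchor text, so a bare URL can be followed or nofollowed just the same. A sketch of the three variants, using example.com from the question as the placeholder:

```html
<!-- Anchored text, nofollowed — what the press release guideline asks for. -->
<a href="https://example.com/" rel="nofollow">blue widgets</a>

<!-- Bare URL as the anchor text, still nofollowed. -->
<a href="https://example.com/" rel="nofollow">example.com</a>

<!-- Bare URL, followed ("dofollow") — the variant the question asks about. -->
<a href="https://example.com/">example.com</a>
```

Since the guideline targets links that pass PageRank rather than the wording of the anchor, a followed bare URL is generally read the same way as followed anchor text.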
Is it bad to use a custom codebase across multiple websites? Does that play a factor with Google?
Also, what about hosting sites with the same custom codebase on the same dedicated server?
As far as Google crawling and de-indexing all of the pages with the noindex tag goes, is that a time-consuming process before all of the pages are removed?
If you have a site with a few thousand high-quality, authoritative pages and tens of thousands of search result and tag pages with thin content, and you noindex,follow all of the thin pages at once, will Google see this as a good or a bad thing?
I am only trying to do what Google's guidelines suggest, but since I have so many pages indexed on my site, will throwing the noindex tag on the ~80% of them that are thin content negatively impact my site?
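If editing templates for tens of thousands of pages is awkward, the same noindex,follow signal can be applied in bulk at the server level with an X-Robots-Tag header. A hedged Apache 2.4 sketch, assuming a hypothetical /search/ URL prefix for the thin pages:

```apache
# Apache 2.4 (requires mod_headers) — send noindex,follow for all
# search result pages in one place instead of editing each template.
# The /search/ prefix is a hypothetical example pattern.
<If "%{REQUEST_URI} =~ m#^/search/#">
    Header set X-Robots-Tag "noindex, follow"
</If>
```

De-indexing then happens gradually as Google recrawls each URL, so across tens of thousands of pages it typically plays out over weeks or months rather than all at once.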
According to this article, http://www.seroundtable.com/farmer-headers-13111.html,
it sounds like I should be 404ing these pages, since I never plan to rewrite them and I want them removed from my site and from the index.
According to this article, http://www.seroundtable.com/google-robotstxt-advice-12759.html,
they believe you shouldn't use robots.txt for this.
Does anyone know the best option in this situation? Should I just 404 a handful of the 40k pagination pages every week/month until they are all 404'd?
I have a website with around 90k pages indexed, but after doing the math I realized that only around 20-30k of them are actually high quality; the rest are paginated pages from search results within my website. Every time someone searches a term on my site, that term gets its own page, which includes all of the relevant posts associated with that search term/tag. My site had around 20k different search terms, all being indexed.

I have paused new search terms from being indexed, but what I want to know is whether the best route would be to 404 all of the useless paginated pages from the search term pages. And if so, how many should I remove at one time? There must be 40-50k paginated pages, and I am curious what the best bet would be from an SEO standpoint.

All feedback is greatly appreciated. Thanks.
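Rather than 404ing the pages one batch at a time, one option is a single pattern-based rule that returns 410 Gone for every paginated search URL at once, so Google drops them as it recrawls. A hedged .htaccess sketch; the /search/.../page/N pattern is a hypothetical stand-in for however your site actually structures those URLs:

```apache
# Return 410 Gone for all paginated search result pages with one rule.
# The URL pattern below is a hypothetical example — adjust it to match
# your site's real search/pagination structure before using it.
RewriteEngine On
RewriteRule ^search/.+/page/\d+/?$ - [G,L]
```

Google removes such pages as it recrawls them, so there is generally no need to meter removals by hand; crawlers handle a 40-50k-page pattern the same way they handle a handful.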