In this very rare case, they would get a 404 with a link to the home page.
Issuing a 301 for every page just in case it is bookmarked is a waste of effort.
This sounds like a job for a canonical tag.
They will be linked to by internal links.
There is no penalty for having duplicates of your own content, but having links pouring away link juice is a self-imposed penalty.
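For anyone unfamiliar, a canonical tag is a single line in the duplicate page's head pointing at the version you want indexed (the URL here is a placeholder):

```html
<!-- Placed in the <head> of the duplicate page -->
<link rel="canonical" href="https://www.example.com/preferred-page/" />
```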
The problem with robots.txt is that any link pointing to a blocked page is passing link juice that will never be returned; it is wasted. robots.txt is the last resort; IMO it should never be used.
The thing is, his site has no penalty on the site, only on the bad links. Removing bad links that are already being ignored will do nothing.
"we are taking targeted action on the unnatural links instead of on the site’s ranking as a whole"
I'm not sure you have a problem. Why not let them all get indexed?
You need to explain that to them, and outline your efforts.
None, but don't expect much.
Some good news and some bad news.
Links acquired long ago can come back and bite you any day.
The fact that Google said they have not penalized your site, but rather just the links, is not all good news. It means they have ignored the links; that is why your site has dropped. Removing them will not lead to a rise in rank. It means that where you rank now is where you will rank without those links, removed or ignored.
At least that is how I understand it. I would still remove them in case I am wrong.
Do you have any actions against you in WMT?
How did you get 100k likes on Facebook? That's a lot. Did you get them legitimately?
I don't know what SlideShare is, but if it is content on another site, then I would get it onto your own site.
I would look at canonical tags and rel="prev"/rel="next".
Also, I would first establish: do you have a problem?
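For reference, the rel="prev"/rel="next" tags go in the head of each page in a paginated series; the URLs below are made up:

```html
<!-- In the <head> of page 2 of a paginated series -->
<link rel="prev" href="https://www.example.com/widgets?page=1" />
<link rel="next" href="https://www.example.com/widgets?page=3" />
```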
Duplicate content is not always a problem. If it is duplicate content on your own site then there is not a lot to worry about; Google will rank just one page. There is no penalty for DC itself. If you are screen scraping then you may have a problem.
People do it to stop scrapers, but if you're going to write a screen scraper it would not be hard to strip canonical tags as well, so I don't think much of the idea.
Bing recommends that you do not use self-referencing canonical tags. It could be that a self-referencing canonical tag may be followed, as is alluded to by Bing, meaning that you lose a bit of link juice through the redirect.
There is no ideal length. It's more like writing a play: it's not the length of the play that makes it a success, it's how well you portray the story, how you set the stage, how you develop the characters.
robots.txt is a bad way to do things, because any link pointing to a blocked page wastes its link juice. Using noindex,follow is a better way, as it allows the links to be followed and link juice to return to your indexed pages.
But it is best not to noindex at all, and to find another solution if possible.
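The noindex,follow version is a meta tag in the page's own head, rather than an entry in robots.txt:

```html
<!-- In the <head> of the page you want kept out of the index -->
<meta name="robots" content="noindex, follow" />
```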
Because Google will drop that URL and crawl the new one by itself.
If all you are doing is changing domain, then one domain-level 301 rule will cover every page anyhow.
But if you are 301'ing page by page, then just do the ones that have external links.
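As a sketch, assuming an Apache server with mod_rewrite enabled (domains and paths here are placeholders), a whole-domain move is one rule, while page-by-page redirects are individual lines:

```apache
# One rule redirects every URL on the old domain to the same path on the new one
RewriteEngine On
RewriteCond %{HTTP_HOST} ^old-domain\.com$ [NC]
RewriteRule ^(.*)$ https://www.new-domain.com/$1 [R=301,L]

# Page-by-page alternative: only bother with URLs that have external links
Redirect 301 /old-page.html https://www.new-domain.com/new-page.html
```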
Just that: a count of internal links. I don't think it is the most helpful stat.
Why it is not reporting all your links I don't know, but I can crawl your site for you using the Bing API and tell you if you have a problem that is blocking the links.
Linking out gives away link juice. That's how PageRank works: someone links to you, they give you some of their PageRank; you link to someone, you give them some.
Matt Cutts once said it can be beneficial to link out, but did not elaborate. I believe he means as a user experience. But the Google PageRank algorithm shows you give away PageRank.
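As a rough sketch of that share-splitting idea, here is the classic published PageRank formula (not Google's current internals; the tiny graphs and damping factor are made up for illustration). A page splits its PageRank equally among its outlinks, so each extra outbound link dilutes what the existing targets receive:

```python
def pagerank(links, d=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    n = len(pages)
    pr = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - d) / n for p in pages}
        for page, targets in links.items():
            if targets:
                share = pr[page] / len(targets)  # split equally among outlinks
                for t in targets:
                    new[t] += d * share
        pr = new
    return pr

# "b" links only to "a": "a" receives b's full share.
one_link = pagerank({"a": [], "b": ["a"]})
# "b" links to both "a" and "c": the share passed to "a" is halved.
two_links = pagerank({"a": [], "b": ["a", "c"]})
print(one_link["a"] > two_links["a"])  # True: each new outlink dilutes the rest
```

The point of the toy example is only the dilution effect: adding the link to "c" cuts the PageRank flowing from "b" to "a" in half.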
I don't understand: if you look at the source code and see the text URL, then that is how you are linking.
Using text rather than an ID gives you a chance to insert a keyword, so text is better than an ID.
Linking to an ID and then redirecting to a text URL, as many CMSs do, loses link juice on the redirect.
If mywebsite.com is 301'ed to some other page, then mywebsite.com will no longer be reached by requests, so its content will never be read.
The content is irrelevant when talking about a 301; all a 301 does is redirect the request. When Google follows a link to your old page it is redirected to the new page, and the link juice now falls on the new page. The content on the new page is read instead.