"Links that link to: any page on subdomain"
The remaining links point to another subdomain. Try selecting: "Any page on this root domain" and see what you get
SEO-wise: have your keyword in the first position in the title
Aside from that, you want to encourage a high CTR with a title that describes the page and entices a click. If you have room for a brand-name boilerplate, I like to add it. I only remove it if the title is too long and would get cut off in SERPs.
If the link is a direct link to your site (view source and make sure it is), then only the top 3 points are taken into consideration, depending on whether it's nofollow or not.
Common consensus: 301 redirects will drop your rankings for ~2 weeks, but you should bounce back in rankings after that. You do lose a small amount of link juice, but if the word "guides" is a common search term for your widget queries, it might be worth the move.
In addition, I would review the backlinks you currently have, sort out the most powerful backlinks, and contact the respective webmasters to change the URL to the new address and avoid the redirect.
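For example, a minimal sketch of the 301 in Apache .htaccess, assuming the old section lived under /widgets/ and is moving to /guides/ (both paths are hypothetical placeholders, not from the original question):

```apache
# Hypothetical paths - adjust to your actual URL structure.
# Permanently (301) redirect everything under /widgets/ to /guides/,
# preserving the rest of the path.
RewriteEngine On
RewriteRule ^widgets/(.*)$ /guides/$1 [R=301,L]
```

Test it with a curl -I against an old URL and confirm you get a 301 with the new Location header before going live.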
http://hashtags.org/ - Gives trends for the more popular hashtags. In your case, I just popped both into its search (you can do it on Twitter too) and saw which has been getting more results (in this case, I'd go with #carpics).
If someone knows of a better service (maybe checks synonyms, related terms, etc) to choose the best hashtags possible, I'd also be interested.
Check out http://www.seomoz.org/blog/smarter-internal-linking-whiteboard-friday
and http://www.seomoz.org/blog/internal-linking-strategies-for-2012-and-beyond
Those deal with footer links across multiple domains/regions.
A lot depends on what the mobile forum is - a separate URL or just a CSS change? Read this
In the end, you need to tell Google what version of the site to serve to regular visitors and which to serve to mobile users.
If it's such a big forum, the crawlers are probably constantly on the site anyways so sitemaps would do you less good than optimizing the actual forum.
However, if you are inclined to create a sitemap, consider making a sitemap index and splitting the threads/sections/replies into separate sitemaps (at your discretion). The goal is for each sitemap to be updated no faster than the crawler can keep up with it (so if a sitemap gets checked 3 times a day, make sure all the links that need to be crawled would appear during those 3 checks).
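A sitemap index just lists the child sitemaps; here's a minimal sketch (the domain and filenames are hypothetical examples, not your actual forum):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical sitemap index; split child sitemaps however suits the forum -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-threads.xml</loc>
    <lastmod>2012-06-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-sections.xml</loc>
    <lastmod>2012-06-01</lastmod>
  </sitemap>
</sitemapindex>
```

Submit the index file in GWT and each child sitemap can then be regenerated on its own schedule.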
There are many factors involved and I would really need to see the site/categories to give a specific answer. However:
Changing URLs will result in a loss of rankings. If you 301 redirect from old to new, you will be able to recover most of the rankings, but there is a ~2 week period where you will drop out of SERPs.
For duplicate content, assuming all pages have been crawled by G, compare your actual # of pages to the # of results returned by site:domain.com. Also see if there are duplicate title tags (usually a sign of duplicate content).
Tough to say. If G is ranking you well and has been for a while - don't fix what ain't broke. There are many examples of people taking a long time to recover from mistaken 301 redirects or improper canonical tags which tanked their rankings - you're looking to do this on purpose.
That being said, you will probably need canonical tags and/or 301 redirects to maximize the site's SEO potential.
If he incorporates the rel-alternate-hreflang tag, that should deal with both issues.
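As a sketch, the hreflang annotations would go in the head of each regional version and cross-reference each other; the URLs and region codes below are hypothetical examples:

```html
<!-- Hypothetical German/Germany and German/Switzerland versions; -->
<!-- include the full set of links on every version of the page -->
<link rel="alternate" hreflang="de-DE" href="http://www.example.com/de/" />
<link rel="alternate" hreflang="de-CH" href="http://www.example.com/ch/" />
```

That tells Google which regional URL to serve to which searchers, even when the content is nearly identical.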
Instead of deleting, you can just noindex + add a link to the original article.
Instead of deleting, you can 301 redirect to the original article.
This removes all duplicate content issues.
According to Google (last paragraph), you shouldn't worry about duplicate content issues:
"Websites that provide content for different regions and in different languages sometimes create content that is the same or similar but available on different URLs. This is generally not a problem as long as the content is for different users in different countries. While we strongly recommend that you provide unique content for each different group of users, we understand that this may not always be possible. There is generally no need to "hide" the duplicates by disallowing crawling in a robots.txt file or by using a "noindex" robots meta tag.
However, if you're providing the same content to the same users on different URLs (for instance, if both example.de/ and example.com/de/ show German language content for users in Germany), you should pick a preferred version and redirect (or use the rel=canonical link element) appropriately. In addition, you should follow the guidelines on rel-alternate-hreflang to make sure that the correct language or regional URL is served to searchers."
Can you set a canonical/redirect on the page that was incorrect, pointing back to the correct page?
i.e. page1.html had a wrong canonical to page2.html -> change page2.html's canonical to point to page1.html
Overall, I think it's just a matter of time before Google is able to recrawl and fix itself... it's odd that canonical + noindex is slower than just noindex. Do whatever you can to get G to recrawl the pages.
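For reference, the fix is just a canonical link element in the head of the wrong page pointing at the right URL (page names here are hypothetical, matching the example above):

```html
<!-- On page2.html (the page with the wrong canonical), -->
<!-- point the canonical back at the correct URL -->
<link rel="canonical" href="http://www.example.com/page1.html" />
```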
Maybe the spacing got mangled when you posted it here, but blank lines can affect robots.txt files. Try this code:
User-agent: *
Disallow: /cgi-bin/
#End Robots#
Also, check for robot blocking meta tags on the individual pages.
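A page-level blocking tag looks like the line below; if you find one in the head of a page that should be indexed, remove it:

```html
<!-- This meta tag blocks indexing and link-following for the single page it's on -->
<meta name="robots" content="noindex, nofollow" />
```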
You can test whether Google can access specific pages through GWT > Health > Blocked URLs (you should see your robots.txt file contents in the top text area; enter the URLs to test in the 2nd text area, then press "Test" at the bottom - test results will appear at the bottom of the page).
The whole site is wrapped in a form, styled with tables (probably coded straight from PSD slices), has tons of inline javascript, lacks analytics code, no images have alt tags, the footer text is printed after the closing html tag... pretty much everything haha.
Yup, other sites will link to that post (improving its PA) and it will send more juice to your site via the link (improving your PA).
Your title is too long: Dog Training Silver Spring MD - Puppy Training - Dog School - Dog Training - Academy Dog Training By Haywood
108 characters with spaces... you should shoot for ~70 (I believe the limit is actually pixel-based, but 70 characters is just easy to work with).
Since Google can't display the full title in the search results, it took the most relevant text from the title "Academy Dog Training" and displayed that instead.
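If you want to sanity-check titles in bulk, a rough sketch in Python against the ~70-character rule of thumb (remember the real cutoff is pixel-based, so this is only an approximation):

```python
def title_too_long(title, limit=70):
    """Rough check against the ~70-character guideline.
    Google's actual cutoff is pixel-based, so treat this as an approximation."""
    return len(title) > limit

title = ("Dog Training Silver Spring MD - Puppy Training - Dog School - "
         "Dog Training - Academy Dog Training By Haywood")
print(len(title), title_too_long(title))  # 108 True
```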
Fyi, the site in general is very poorly coded and looks like it's got a keyword-stuffed footer too.
Check out this post on Google and long title tags
SEOMofo title tag length experiment - 107px title is the longest allowed length
No. If a high authority site creates a blog post with your link in the article, that is a low PA link from a high DA site (at least when it is first posted). From what I can tell, all links improve your PA/DA - just a matter of how much.
The higher the PA and DA of the site linking to you, the more PA and DA your site gets.
Ouch, sorry to hear that. I just added the htaccess code to one of my dev sites and it seems to redirect just fine (200 when visiting www, 301 when visiting non-www). I'm guessing that Google is going to the non-www page and is being redirected to the www version. Maybe switch the code to do the opposite (assuming your goal was to canonical/consolidate the urls). Message me if you want the htaccess code.
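For anyone else reading: a common non-www to www 301 in Apache .htaccess looks like the sketch below (this is a generic example with a placeholder domain, not necessarily the exact code from my dev site):

```apache
# Redirect non-www to www with a 301 (example.com is a placeholder domain)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

Swap the condition and target to go the other direction if you want non-www as the canonical version.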
For fixing your site rankings, they will most likely bounce back on their own in due time so the best thing to do is try to get G to recrawl all the pages ASAP. I would:
Make sure you add and verify both the www and non-www versions of the site in GWT, then choose which one you want to be primary.
Change your sitemap to have frequency/priority crawl rates and the correct www/non-www links, then resubmit your sitemap in GWT.
Share links on twitter (they are crawled very often)
Create an rss feed with your pages and ping it
Good luck getting it ranking again.
-Oleg
Bad strategy. Non-adult site linking to adult related content is a big no-no in Google's eyes - that's why your articles were rejected. Article directories that would accept your articles are probably filled with spam and are low quality.
I don't really deal with adult websites, but you should check out www.netpond.com - the largest (at least when I last checked) adult webmaster board.
You can also use OSE to look up the PA/DA of a page and its backlinks (including internal links).