I see two possibilities that might help explain things. Your site might have site architecture or internal linking issues that prevent a search engine bot (or the SEOmoz bot) from crawling internal links. If you share a URL, we can take a look to make sure there aren't actually any issues. Also, OSE only crawls approximately 25% of the links on the web, so it's possible that some of your internal links simply aren't included in the index. I hope this helps further the investigation and gets you closer to an answer.
Posts made by Bevelwise
-
RE: Only 2 internal links in OpenSite Explorer?
-
RE: If I add the the '&utm_source=MSN' parameter to my URLs in AdCenter, will this reset my history for the KW/campaign(s)?
If I understand your question correctly, you are asking whether adding the utm_source query string parameter will affect the stats in your campaign. No, your keyword stats in the MSN AdCenter campaign will not be lost, because you are only changing a destination URL in your ad copy. If you're asking whether Google Analytics will pick up the change and alter historic data, the answer is also no.
Another tidbit: use &utm_term={keyword} as well, because that's AdCenter's parameter for passing in the dynamic keyword that was clicked.
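As a rough sketch, a fully tagged destination URL might look like this (the domain, path, and campaign name are placeholders; {keyword} is filled in dynamically by AdCenter):

```html
<!-- Hypothetical AdCenter destination URL with tracking parameters appended -->
<!-- http://www.example.com/landing-page/?utm_source=MSN&utm_medium=cpc&utm_campaign=example-campaign&utm_term={keyword} -->
```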
-
RE: Keywords
Yahoo used to be the only engine that really took this into account, but since Bing took over the technical side of their engine last year, it has become pretty much insignificant across the board. Bing tries to be more like Google every day. Even though filling in the tag may only take a measly minute or so once you know the target keywords for your content, I believe the disadvantage of showing your competitors which terms you are trying to rank for outweighs the hypothetical benefit of engines still taking it into account.
-
RE: Country specific landing pages
The other issue most international sites have is how they handle content in different languages. You also don't want to create duplicate content issues by increasing your content exponentially using either machine or user-generated translations. We, of course, always prefer user-generated translations.
Another thing to check out is a post on the SEOmoz blog about handling duplicate content. The section I specifically want to point out covers the link rel="alternate" tag:
<link rel="alternate" hreflang="x" href="http://www.example.com/path" />
This tag can be used to tell search engines that those pieces of content are just translated versions of another piece of content, so there is no confusion.
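As a sketch of how the tags pair up (URLs and language codes here are just placeholders), each translated version references the other in its head section:

```html
<!-- In the <head> of the English page -->
<link rel="alternate" hreflang="es" href="http://www.example.com/es/path" />
<!-- In the <head> of the Spanish translation -->
<link rel="alternate" hreflang="en" href="http://www.example.com/path" />
```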
-
RE: What should I do with my blog...V2
The whole question is whether having a blog on a sub-domain or in a sub-folder makes a difference. Since sub-domains do not inherently get any authority from their root domain, I see the sub-domain as the worse option, so no, they are not correct that it's the same. The most common example I have heard and used is people who launch blogs on Wordpress.com. If the blog is launched in a sub-folder, it is seen as more of a part of the main website, and authority will transfer back and forth better for you. If you want an example from someone smarter than me, just look at the way SEOmoz does it...lol.
The most common problem I run into when talking to clients about this is the way the original website was built. Sometimes the platform or CMS they chose is restrictive and doesn't allow common blog platforms to handle URL rewriting correctly, among other factors.
If you are given the option, I recommend a sub-folder setup. The sub-domain may be easier for developers to set up, which is what I usually hear from them. As far as downtime, just do the switch on a Saturday or Sunday night. I don't see that as a big deal, and when thinking about the long haul I would rather have it set up correctly.
-
RE: Quotations about importance of social media and backlinks
Hi Julian,
Here's an article, "The Tweet Effect", that I read a while ago on the SEOmoz blog and that sounds like it will help support you. It was an experiment on how social media (in this case Twitter) can impact rankings. Not necessarily a quote, but I think you will be pleased with the results and could find a way to factor them into your presentation.
-
RE: Multiple Google Places listings under review
I don't think you're going to like the course of action here, because it is: be patient. The database behind Google Places has had issues with duplicating and mixing up listings going back a long time. They have gotten better over time, but are not perfect. I would not keep deleting and recreating things in their database, because I think it will only cause more confusion. You could go to the Google Places forum and try to get some assistance there if you have not already. It is mostly moderated by the community, but sometimes you'll see Google employees responding to posts and expediting fixes.
Was there anything in the listing that might have flagged it for review? Oftentimes people try to use keywords in the title of the listing because it has improved performance in the past, but Google does not like that. Moving forward, if you want to make changes to a current listing, I would make sure that listing is verified and then process the changes, as opposed to deleting old listings and creating new ones on top of them, which is what it sounds like is happening; correct me if I'm misunderstanding.
-
RE: How to optimize a Stock Symbol page?
On-page optimization is only one factor in achieving the organic results you are looking for. The next thing I would ask is how authoritative the domain and individual pages are from a link perspective. If you haven't run the OSE tool on your home page and the individual stock pages you wish to rank, do that; it will tell you how strong or weak your website is from a link perspective. You'll want to look into a link building strategy that will work best for your business/website. On-page optimization sets the foundation for your SEO strategy, but links are what will give you the lift.
-
RE: Best keyword research tool
Wordtracker is another tool that hasn't been mentioned yet. It is paid, but it looks at keywords in a different way and usually surfaces ideas I had not previously thought of or discovered using SEOmoz's tools or the Google AdWords Keyword Tool.
-
RE: Pages not cached
I would make sure you have submitted an XML sitemap in Google Webmaster Tools that includes those article pages; that will help. Creating an HTML sitemap on your website with links to those article pages will also help.
How your site architecture is set up also affects how Google crawls your website. Are those article pages within 2-3 clicks of the home page? Are all the links crawlable HTML?
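For reference, a minimal XML sitemap listing those article pages could look like this (the URLs are placeholders; swap in your real article URLs):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/articles/example-article/</loc>
  </url>
  <url>
    <loc>http://www.example.com/articles/another-article/</loc>
  </url>
</urlset>
```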
-
RE: We have one particular blog post that is very popular. How can we direct that traffic to other parts of our website?
I would include anchor text links in the body content of the post pointing to related pieces of content or older blog posts on the same subject. Users are more likely to click a link in the blog content area than they are to click on other navigation elements. If there is a specific page on your website you are trying to direct them to, you could use an attractive graphical call to action, but that doesn't necessarily mean they'll explore the rest of the website fully.
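As a simple sketch (the URL and anchor text here are hypothetical), an in-content link to a related post is just a normal anchor inside the body copy:

```html
<p>... for more detail, see our earlier post on
<a href="/blog/related-post/">choosing the right widget</a>.</p>
```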
I would be more inclined to figure out what makes that post so popular and play on that again. Is there a possible follow-up post you could do, or another post on a similar subject? As great content continues to expand on the website and users keep coming, they will notice other areas of the site that may also apply to them.
-
RE: Canonical tags and SEOmoz crawls
I am actually unsure whether SEOmoz's crawl report takes into account all the ways duplicate content can be fixed. Another thing that will create a similar effect is using parameter handling in Google Webmaster Tools. I don't think it's a big issue to have these pop up in your crawl report; we use crawl reports as indicators to help identify these issues. If you've fixed the issues with best practices and seen positive results in the search engines, that's really what matters.
-
RE: Ditching of spammy links - will it be of benefit?
Really the only way to get the links fixed is to contact the webmasters of those websites and have them removed. Either that, or remove the destination page's URL, move the content, and let the destination 404. That would only work if the links point to a sub-page instead of the home page, because obviously no one wants their home page to 404.
The next thing to focus on, in my opinion, is how badly these links are hurting you. There are two ways I look at bad/useless links. First, search engines can decide a website is pretty much junk and that its links pass no value; they are basically null to a link profile. Then there is the really bad stuff that can actually affect you negatively. In my opinion, most bad links end up just being null or useless to your link profile.
Since you're concerned about the bad links and want to attempt to remove them, I would first put your effort into a "white-hat" link building strategy; then, once you've built up more links, you could go through and start trying to clean up the old spammy links. That way you won't really lose any rankings from shedding a handful or more of links from your profile, since you're replacing them with new ones.
-
RE: Duplicate Page Title Elements
The approach with something like this is to be broader at the home page level and more specific on category and product pages. Another factor that helps those category pages rank is your internal anchor text, which should ideally say "Dog Shock Collars". At the home page level you will want to go broader with your on-page target terms, as long as you have the link profile to rank for those more competitive terms. If you do not have enough link authority to get good rankings with them, I would keep the home page terms specific until you build a link profile that can compete, and at that point change your on-page optimization to focus on the broader terms.
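As a sketch of that broad-vs-specific split in the title elements (the exact titles and store name are hypothetical):

```html
<!-- Home page: broader target term -->
<title>Dog Training Collars &amp; Supplies | Example Store</title>
<!-- Category page: specific target term -->
<title>Dog Shock Collars | Example Store</title>
```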
-
RE: 3 Different Home Page URL's Being Indexed?
Duplicate content is never a good thing and can be handled a number of different ways. In this case I would do a few things to make sure you are not causing confusion and ultimately splitting your link profile between multiple URLs. I would:
- 301 redirect /home.php to /
- Install a canonical tag on the home page
- If you haven't already registered your site in Google Webmaster Tools, do that; then, under Site Configuration, use parameter handling to tell Google to ignore the ?cat= query string parameter.
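For the canonical tag, a minimal sketch of what goes in the home page's head section (using your real home page URL in place of the example) would be:

```html
<link rel="canonical" href="http://www.example.com/" />
```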
Even though you are getting multiple versions of your home page ranked, are they ranking for different keywords? I believe that if you fix the duplicate content issues, you will retain your rankings with a consolidated home page.
-
RE: How to have an internal call to action with an anchor that employs the correct internal linking practices?
Internal keyword-rich anchor text can help with your SEO strategy. If I understand correctly, you're linking to portfolio pages on your website. In that case, I think better keyword-rich anchor text would be "[nickname]'s web design portfolio", which could link to your home page. Your home page is going to be the best place for your strong call to action, which I believe is handled well by allowing users to enter their email address to apply for the beta. This is what I would use:
<a href="">[nickname]'s web design portfolio</a> - StyleJam showcases your portfolio
-
RE: Link Building
There are a couple of different strategies I would mention here that may help you out. As far as "tools" go, I believe they are best used to find link prospects, because there aren't really any tools that will just get links for you (at least quality ones). I would use OSE, which has free and paid versions. Another free one is Yahoo Site Explorer. In either case, do some competitive analysis to see what links your site's competitors have, then contact those websites to see if you could get a link on their site as well. A paid tool I have used and would recommend for finding link prospects is Raven's SEO Tools.
Another free link building technique I would recommend is finding guest blogging opportunities. If you haven't made an account on MyBlogGuest, their platform might help you find them. I hope these strategies apply to your business/website, because it really depends on what is available on the web and what will work for your site.
-
RE: Recommended Wordpress Plugins
No one has brought up Yoast yet. It is my favorite so far because I want a plugin that handles everything, and Yoast handles almost everything we've needed on the many sites and blogs we've worked with.
-
RE: Duplicate content.
A really great write-up Rand did on canonical tags is on SEOmoz's blog; it's from a while back but still relevant today.
-
RE: How long does it take for Google to de-index urls?
Google won't give a specific timetable for how quickly they update. We would all hope they drop a URL the next time they re-crawl that specific page and then update the index shortly after, but examples have shown it's not quite that fast. If you have registered a Google Webmaster Tools account, you can go to "Site Configuration" -> "Crawler Access" -> "Remove URL" and send an actual request to have the URL removed from the index. This will help expedite the process.