Was wondering if/when the videos will be made available to participants...
Thanks!
First there was the nofollow robots meta tag for page-level exclusion, and then Google adopted the more granular rel=nofollow attribute for individual links on a page. I find that too many SEOs overuse the rel=nofollow attribute when there is a much more elegant solution available. The reason for the overuse is the now-debunked tactic formerly known as PageRank sculpting. I had a well-known culture/nightlife site in NYC as a client that had placed literally thousands of rel=nofollow attributes on links throughout the site... granted, this does not seem to be your problem, but I digress...
To illustrate my point, Matt Cutts discusses how rel=nofollow attributes affect how Google passes PageRank to other parts of your site (or, more precisely, how nofollows decay the amount of link juice passed). In the case of a few pages or even large directories, etc., I would do the following:
Saving your Googlebot crawl budget for only high-value pages is a great way to get more of those pages into the Google index, providing you with more opportunity to promote your products, services, etc. Also, limiting the number of rel=nofollows used and allowing link juice (or PageRank) to flow more freely throughout your site will prove beneficial.
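To sketch the more elegant option (the paths below are hypothetical examples, not from any actual site): blocking low-value sections in robots.txt conserves crawl budget in one place instead of scattering rel=nofollow attributes across thousands of links.

```
# robots.txt -- paths are hypothetical examples
User-agent: *
Disallow: /search/
Disallow: /print/
Disallow: /tag/
```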
@alhallinan - great thumbs-up-bait question! 
I would suggest you implement an absolute 301 redirect from the 'red shoes' page to the category parent page as opposed to the 'brown boot' page. Although both items are 'footwear,' the category parent for 'red shoes' strikes me as more semantically relevant than a lateral redirect to a less semantically related item like 'brown boots.' Keep in mind, though (I don't know your site architecture), that this could be splitting semantic hairs depending on your information architecture and navigation schema. If you ever need to revive the 'red shoes' page in the future, you can simply break the 301, although link juice and link quality diminish over time, so the page won't have the same strength as before.
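As a sketch (paths hypothetical), the permanent redirect in an Apache .htaccess file would look something like this, sending the retired product page to its category parent:

```
# .htaccess -- hypothetical paths
Redirect 301 /footwear/red-shoes /footwear/shoes/
```

Reviving the page later just means removing this line, with the caveat above about diminished link strength.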
It is certainly possible for a site to be harmed by manipulated inbound links from bad neighborhoods. It is a trademark tactic employed by black hat SEOs against competitors. Granted, a handful of links won't make a difference, but a concerted effort on a negative link campaign can and will get your site hammered. This is especially true in highly contested market verticals such as insurance, credit scoring, mortgage, etc.
Not all URL shorteners use 301 redirects - some use 302s, which won't pass any link juice - so be careful which service(s) you use (bit.ly is fine in this case ==> here's a larger list to peruse). You can also roll your own shortener, which has some benefit re: keywords/relevance in the URL as well as flexibility in choosing which type of redirect to use.
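A minimal sketch of the 'roll your own' idea in Python (the slug-to-URL mapping is hypothetical); the point is simply that you, not a third-party service, choose whether each redirect is a 301 (passes link juice) or a 302 (doesn't):

```python
# Hypothetical short-link table: slug -> (destination URL, redirect status).
# 301 = permanent (passes link equity); 302 = temporary (doesn't).
SHORT_LINKS = {
    "red-shoes": ("https://example.com/footwear/red-shoes", 301),
    "spring-sale": ("https://example.com/promo/spring", 302),
}

def resolve(slug):
    """Return (status_code, location) for a short slug, or (404, None)."""
    if slug not in SHORT_LINKS:
        return 404, None
    url, status = SHORT_LINKS[slug]
    return status, url
```

In a real deployment the lookup would sit behind your web framework's redirect response; the dictionary here just stands in for whatever datastore you'd use.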
Hope this helps. Be well!
It's not so much that it is an oft-used tactic but more about how the tactic is executed - it's all in the precision. Placing inbound links on rotten C-blocks known for spam, spam rings, or malware hosting; placing paid links (I've known folks who have gotten hammered with just 10 paid links) via well-known text-link brokers, etc. All your competitor needs to do is find a sleazy corner of the internet to target you, and it can be flagged by Google with astonishing speed. There are black hat forums that post this sort of targeting information.
Yes - follow the link in my expanded answer above... the link points to Matt Cutts' original article from February 2009 explaining how/when/why the change was made.
Like the other two responses, the answer is no. But I would also like to point out that PageRank is probably not the best metric to emphasize either. Check out http://www.seomoz.org/blog/what-is-googles-pagerank-good-for-whiteboard-friday for an updated discussion about PageRank from last week's Whiteboard Friday.
Be well!
Hi Derek -
Great question and thanks for engaging. First of all, let me say clearly that I don't engage in black hat tactics. I'm Dir of Search Marketing for a lead generation company and so must protect our properties from such attacks. It can be a very dirty business.
I've had many discussions with other professionals in our field and it appears that Google's web spam team (which is quite small, relatively speaking) doesn't have the bandwidth to police all market verticals at once and they can't rely on doing this algorithmically with 100% accuracy. So there are gaps in coverage, etc. They'll generally go after verticals that have had a large number of FTC complaints, abuses, etc. A lot of dark corners such as 'payday loans' are generally left alone... do a search and look at the link graphs for the entire 1st SERP... pretty amazing stuff. 
I agree with you that there is a lot of bad information put out by the woefully misinformed. You can still rank sites using inbound links from forums and low-quality sites (generally in concert with manipulated anchor text) with no problems. Some neighborhoods are worse than others - I don't know your site, so I can't really do more than speculate here. The kind of attacks I am talking about are highly targeted and have a very specific goal in mind: to burn your site.
Hi Andrew -
Shorter URLs generally outperform longer URLs. Also, fewer directories or file paths in the URL makes it easier to distribute link juice to the detail pages throughout the site (this necessitates a fairly flat file structure, though, and may not be the right solution for larger sites with diverse content, products & information requiring more complex taxonomies - that doesn't seem to be necessary in your case, however).
I would suggest www.gooverseas.com/internships-abroad-china ... you get your keyword phrase in the URL with an economy of words in the process - leaving out '-in-' saves three characters. You can make up for this in your on-page key term targeting using meta tags, H1s, etc. to create the semantic drill-down.
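A rough sketch of that semantic drill-down in the page markup (all copy hypothetical): the full phrase lives in the title, meta description, and H1 even though 'in' is dropped from the URL slug.

```html
<title>Internships Abroad in China | GoOverseas</title>
<meta name="description" content="Browse internships abroad in China by city, field, and duration.">
<h1>Internships Abroad in China</h1>
```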
TY
I would actually need to see the domain to make an assessment... feel free to msg me privately, if you like.
It is extraordinary, and thankfully it's an exception, not the rule. Although Google claims that it is not possible for a malicious attack of this type to be successful, I've witnessed it first hand. Also, one just needs to read through the forums to get a sense of what is possible.
Be well!
I would go with the shorter URL structure in this case (http://www.gooverseas.com/internships-abroad/china) and use on-page key term targeting (as stated above) to help with semantic definition of the page content.
I noticed that none of the country-specific pages have Page Authority (using the SEOmoz toolbar). I'm not an expert on Drupal, so I'm not sure if the pages are brand new (they all have very recent cache dates and don't show up in the Internet Archive) or if the 'CTools jump menu' is preventing link juice from passing, which will make it difficult to get them to rank (unless they receive inbound links at some point).
So I made the SSL cert badge from Starfield Tech a simple clickable .jpg rather than a call to a 3rd-party server (which included Flash... WT?), and it cut my page load times site-wide by 1,000ms ... pretty awesome.
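For anyone wanting to do the same, the swap is roughly this (URLs hypothetical): replace the third-party script/Flash embed with a locally hosted image wrapped in a plain link to the verification page, so the only extra request is for your own .jpg.

```html
<!-- Before: script/Flash embed served from a 3rd-party server -->
<!-- After: locally hosted, clickable badge image (URLs hypothetical) -->
<a href="https://seal.starfieldtech.com/verify?domain=example.com">
  <img src="/images/ssl-badge.jpg" alt="SSL certificate verified" width="120" height="60">
</a>
```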
Thank you guys for attempting to answer my admittedly poorly asked question. 
Another possibility is that your rankings reports may be pulling from different Google data centers, while geo-location and local results compound the problem. For example, SEOmoz may be pulling from a different data center than your other tool, while your location will play a role in the shuffling of results as well. In my experience, though, geo has more impact on ranking differences than sync issues between data centers, now that Google is placing more local businesses up front in the results. Run a ranking-report test with both tools from the same data center as well as the same geo and review the results ...
http://www.seochat.com/seo-tools/multiple-datacenter-google-search/
Let me know if this helps.
This is simply wrong. Never take what Google says at face value. Please read the following article from SEJ and then the comments from heavyweight SEOs who've been in the business for a while:
Unless your site has incredibly strong authority and trust metrics with hundreds of thousands or even millions of inbound links (like CNN or WSJ), it is absolutely possible for it to be harmed by a malicious link campaign with adult-oriented anchor text (for example). This has been tested and proven.
A team member is porting over documentation from a .org wiki that will be placed on the company's root domain. The problem with MadCap is that it uses frames as well as JavaScript navigation. Has anyone encountered this problem before?
I'm unfamiliar with the software and the project is pretty far into the pipeline at this point (I'm new at the company as well). Any advice on work-arounds or alternatives would be greatly appreciated.
HaHa, LOL 
All the skeptics ask for proof... My interaction with this thread and elsewhere is not to encourage or divulge how to operate a malicious link campaign but to quash the myth that Google says it can't be done. More double-speak from Google from a prior post:
If you find unnatural links to your site that you are unable to control or remove, please provide the details in your reconsideration request.
If you have any questions about how to resolve this issue, please see our Webmaster Help Forum for support.
Sincerely,
Google Search Quality Team
I work in the lead generation business and have witnessed first hand publisher sites that have been burned by such attacks. I've also witnessed publishers inadvertently burn their own sites because of the velocity and volume of link growth (usually with 100% identical anchor text).
No one is going to write that case study and nor should anyone publish it IMO. 