They weren't critical, since the page still loaded with scripting turned off completely, and the W3C validator reported only a dozen errors, so they shouldn't be too hard to fix.
Posts made by RyanPurkey
-
RE: Wc3 validation is it still that important
-
RE: How long does Google take to re-cache a site?
Have you submitted a sitemap for the new site and do you have the site attached to Google Webmaster Tools? You should have more insight into crawling if you do so.
In the meantime you can use a text browser like the ones at http://www.seo-browser.com/, http://whois.domaintools.com/, or http://www.linkvendor.com/seo-tools/se-spider.html to get an idea of what a spider sees.
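If you'd rather not rely on a third-party tool, the same idea can be roughed out in a few lines of Python. This is only a sketch of what a text-browser / spider view does, not what any of the tools above actually implement: drop scripts and styles, keep the visible text and the link targets.

```python
from html.parser import HTMLParser

class SpiderView(HTMLParser):
    """Rough approximation of a text-browser view:
    skip script/style content, collect visible text and hrefs."""
    SKIP = {"script", "style"}

    def __init__(self):
        super().__init__()
        self._skip = 0   # depth inside skipped tags
        self.text = []
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self._skip += 1
        elif tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

    def handle_endtag(self, tag):
        if tag in self.SKIP and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.text.append(data.strip())

def spider_view(html):
    """Return (visible_text, links) for an HTML document string."""
    p = SpiderView()
    p.feed(html)
    return " ".join(p.text), p.links
```

Feed it a fetched page and you get roughly the flattened text-plus-links view a crawler works from, minus all the presentation.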
-
RE: Wc3 validation is it still that important
If the scripting errors cause crawler and indexation problems, then they're problematic, but script errors, like a lot of things, come in shades of gray. Some are really bad, some not as bad, and some are benign.
W3C validation is nice, though, as it helps ensure the quality of the network in general.
-
RE: Will becoming a YouTube partner get me higher rankings on the site?
Yes. Check out the details here: www.youtube.com/t/partnerships_benefits
-
RE: Do you want an SEOmoz profile badge?
I like it, but I'm also in agreement with others recommending that it be weighted more toward the quality of answers somehow.
-
RE: Tool/Method to find users on Twitter from a CSV file
Since you're looking for users and not necessarily tweets, you can collect the company names you're interested in into a spreadsheet, and then append each unique name to this URL: http://twitter.com/#!/search/users/
Just getting rid of duplicate company names should cut down the list some, but it's still going to be a tedious process. At least with the spreadsheet you can add further columns to prioritize your work and go after the companies that are most applicable.
To cut the tedium, Mechanical Turk could then process the results for you fairly quickly.
-
RE: Dofollow Blog Comments
If you're planning on doing this with a non-disposable, branded website, it's a bad idea, as there can be lots of negative effects.
If you're planning on being black hat, having a network of disposable sites, and are going to be masking (or trying to mask) everything you do, it's just another tool in your toolbox.
As people have already mentioned here, it's better to make worthwhile comments that can be traced back to a reputable-looking source, as those could even have the added benefit of bringing you additional business, not just a boost in the rankings.
-
RE: Will becoming a YouTube partner get me higher rankings on the site?
It'll give you more promotional options and data, which in turn will get you more exposure. That alone could get you more views than YouTube rankings in the conventional sense.
-
RE: SEO Triage - What matters most?
If I were only given 8 hours, I'd spend them all on project mapping. The work a site requires to be effective is way more than 8 hours, but you could construct a broad project road map that would lead you consistently in the right direction for a year or more.
With the above context I'd be working on increasing domain authority and auditing site architecture to get the most bang for the buck. Steve's suggestions are spot on.
-
RE: What's the oldest "blank" on the Internet?
Here's the back story on the oldest networked email: http://openmap.bbn.com/~tomlinso/ray/firstemailframe.html
According to Wikipedia and itself, http://symbolics.com/ was the first registered .com domain name and is still active.
Someone could argue that the oldest active link is a telegraph line that's still hung and running somewhere, but DNS in general would be a better answer, and more precise still would be the invention of HTML, with its anchors, during 1990-93.
Tim Berners-Lee's HTML editor screen shot: http://www.w3.org/MarkUp/tims_editor
And HTML in general: http://www.w3.org/History/19921103-hypertext/hypertext/WWW/MarkUp/MarkUp.html
-
RE: How Fast Is Too Fast to Increase Page Volume of Your Site
Sometimes you see a bit more organic indexation of content (in terms of people just naturally linking to things) if pages are dynamically generated: a user selects the amenity, location, etc., the database creates the page on the fly, and then someone links to it. Still, it's nice to have a bit more control over page URLs and give them a more established, static presence ahead of time.
I wouldn't worry about their release in terms of volume. Just make sure the pages function, have sitemaps indicating the new content, and try to drum up a press blitz about the new pages in social and conventional channels. Google does well with large increases if it sees a correlated increase in search or press.
-
RE: Using the PA metric from the mozbar
The link might not be in the OSE index, which is where the PA data comes from. If you do have a clean link pointing to the page, you can be pretty sure some link juice is flowing to it, but it will take time for the site to be crawled and that link strength to be added.
-
RE: Weird situation with our local listing.
Solution #2: They should hire you as their SEO.

-
RE: Beaten in SERP's by a site going 'all in' on 2 keywords in their anchor text profile.
Right. The more competitive the more shuffle you'll see.
-
RE: Weird situation with our local listing.
Since you're in real estate I would sell your office to ABC Realty for a hefty fee due to your office having prime Google SERP location.

Slightly more practical: if the corporate office submitted a bulk listing for its real estate agents and their own location, make sure this is set up properly (http://www.google.com/support/places/bin/answer.py?hl=en&answer=173669). Or they might not have done this at all, and Google is now considering you the corporate office.
-
RE: Managing 404 errors
Also, be sure to have a user-friendly 404 page. 404s are unavoidable due to typos, silliness, and random acts of God, so it's always wise to have a highly functional page as a catchall for anything that you can't 301 redirect.
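The "301 what you can, 404 the rest" logic boils down to a lookup with a fallback. A minimal sketch (the paths and page names here are hypothetical, and real sites would do this in server config rather than application code):

```python
# Hypothetical map of old paths you know have moved: these get a 301.
REDIRECTS = {
    "/old-products": "/products",
    "/blog.html": "/blog/",
}

def resolve(path, known_paths):
    """Return (status, target) for an incoming request path:
    200 for live pages, 301 for known moves, 404 catchall otherwise."""
    if path in known_paths:
        return 200, path
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 404, "/404.html"   # the user-friendly catchall page
```

The point is the ordering: redirect everything you can account for first, and let only the truly unknown requests fall through to the friendly 404.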
Examples:
http://www.apple.com/gljasdlj
http://pages.ebay.com/gljasdlj
http://www.cnn.com/gljasdlj
-
RE: Beaten in SERP's by a site going 'all in' on 2 keywords in their anchor text profile.
Keyword stuffing is rarely a good idea, as it's both annoying for a reader/user of a website and a commonly flagged spam technique.
Position 1.10 is a tricky ranking because you do get some boost from being at the bottom of the page, but there's a high likelihood of getting shuffled out of that position. I'd recommend not worrying so much about who is currently in 1.10 and going after the link profile of the top 5 in that SERP. It sounds like you're in a good place in terms of conversion, so really it's a matter of building your brand and backlinks.
-
RE: How Can I move a site higher in Google Places?
If the order reversal includes shuffling the place name away from "city, ST" you can see dramatic differences.
-
RE: What are the differences between Google SEO and Bing SEO?
I'd take a "Good for Google AND Bing" approach and focus on getting links from a more diverse set of root domains, as that will help rankings in both. Shortening your URLs may help, but it will also require changing your site, redirects, canonicalization, and everything that goes with that. If you have any advertising data via Bing, it can also help with content creation that may do better there (in terms of conversion). Again, even on my sites that should be better tailored to the niches Bing has been growing in, Google delivers the lion's share of traffic, like 20 to 1.