You're going to have to give a lot more information to properly diagnose a site being deindexed. Unfortunately there are a number of potential issues, and possibly more than one at play. If you're comfortable posting the domain name, that would be a start.
Best posts made by KaneJamison
-
RE: Home page (s) deindexed
-
RE: No URL for my blog?!!!
Hi there,
This isn't bad enough to completely stop your efforts, however it makes me more concerned about your website platform. Which service are you using? Have you verified with their support team that you can't put the blog post title in the URL or change it manually?
-
RE: Is there actual risk to having multiple URLs that frame in main url? Or is it just bad form and waste of money?
From what I understand, Google won't 'count' any content that is iframed on a site, so essentially Google will just see a blank page with an iframe pointing to another site. That's not a risk to the main domain inside the iframe, but it's most likely not doing anyone any good either.
Are they ranking for anything with these extra sites, and do they get any traffic?
I would probably see if they'll dedicate any budget & time to creating secondary sites on the better keywords, and I'd encourage them to 301 the rest. Possibly even dump some of the worse domains if they're not worth keeping, but the client might be trying to do a land grab on keywords to keep competitors out, which might be worth the annual fee to them.
-
RE: 'App Packs' Ranking Tool?
Aha - got it.
As a tip to Awestwood - you might be waiting a while before this feature is fully rolled out. I say that because app SERPs are relatively infrequent, and because most app companies are probably only monitoring this across a small set of keywords (not hundreds or thousands).
You could build your own solution if you're technical and/or willing to edit some open source options. All you're really doing is setting up a scraper for the URL of each keyword search that runs daily.
You'll probably need to plan on using a couple of proxies over time depending on the number of keywords, and you'll probably have to update the code once in a while as Google rolls out new formatting changes.
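If you go the DIY route, the core of it is just a parser that checks each day's SERP HTML for app-pack markers. A minimal sketch in Python - the marker strings here are assumptions, and they'll need updating as Google's markup changes:

```python
def has_app_pack(serp_html):
    """Rough check for an app pack in raw SERP HTML.

    The markers below are assumptions -- Google's markup changes often,
    so expect to revise them as new formats roll out.
    """
    markers = [
        "play.google.com/store/apps",   # Android app result links
        "itunes.apple.com/app",         # iOS app result links
    ]
    return any(m in serp_html for m in markers)

def track_keywords(keywords, fetch):
    """fetch(keyword) -> HTML string; wire this to your scraper/proxy setup.

    Returns {keyword: True/False} for app-pack presence.
    """
    return {kw: has_app_pack(fetch(kw)) for kw in keywords}
```

Run it daily per keyword and log the results over time - that's essentially all a commercial rank tracker would be doing for this feature.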
-
RE: Schema.org for a rental site with more than one apartment per address
I'm not familiar with it. There's some decent info on it at http://semanticweb.com/schema-org-adds-additional-type-property_b30861, but I'm not sure if that's helpful.
-
RE: Home page (s) deindexed
Gotcha. Unfortunately it's really difficult to diagnose without a full range of information to look at. For example, here's a random assortment of things that might cause your home page to disappear:
Code issues: Implementing noindex on any pages, incorrect use of rel=canonical, robots.txt updated incorrectly
Backlinks: It would take a heck of a lot of crappy spam links to get both domains entirely removed, I believe. Something like that would be more likely if there were shady things being done on-site, like cloaking or other activities explicitly listed in the Webmaster Guidelines: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35769
Content: Is the content on both homepages unique?
Google issues: Any of the Panda updates could potentially affect your site, although the homepage entirely disappearing sounds drastic. Check through the recent algo updates here: http://www.seomoz.org/google-algorithm-change and see if any of them seem applicable to the client.
Are there any alerts in your Google Webmaster Tools area that would give you a clue?
You say that there are two sites owned by the same client, and they both disappeared all of a sudden? To me that suggests either
- (A) someone screwed up the robots.txt or noindex or canonical tags,
- (B) something shady was done on both sites, or
- (C) this is a case of user error and the site hasn't actually been deindexed, but appears to be for whatever reason.
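For cause (A), the classic mistakes usually look something like one of these (illustrative snippets, not your actual files):

```
# robots.txt -- a stray slash that blocks crawling of the entire site:
User-agent: *
Disallow: /

<!-- or a leftover noindex tag in the page <head>: -->
<meta name="robots" content="noindex">
```

Either one is enough to pull a homepage out of the index, and both are easy to miss during a site update.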
-
RE: .US VS .COM TLD Domains
Update: The Google Webmaster Central update listed by Alan below is the format I would likely go with if you're going to have multilingual content.
In most situations I would go with the .COM, and focus that on the US as well as general international traffic. I would use a subfolder such as /ca/ for Canada, /fr/ for France, etc.
If there's some weird quirk with the company, product, or distribution then I might change that, but that's definitely the minority of examples.
Recommended Reading:
http://www.seomoz.org/ugc/folders-vs-subdomains-vs-cctld-in-international-seo-an-overview
http://www.seomoz.org/blog/international-seo-where-to-host-and-how-to-target-whiteboard-friday
http://www.seomoz.org/blog/seo-guide-international-versions-of-websites
-
RE: Reciprocal links / seo satellite
It will take a lot of work on your part to build up a network of satellite sites important enough for those bloggers to care about. While it could work eventually, I think it's a lot of effort when you could be spending that time on better strategies with those sites.
Keep in mind that if there really are only about 100 of these sites, that's not many. If you go around giving them the idea that you're a spammy or low-quality brand, then they'll never trust you, and you will have burned any possibility of a future relationship.
Rather than trying to exchange link for link, try something like this:
- Offer them something else of value: write a guest post for them, interview them and publish it on your site or somewhere else.
- Trade them some of your product that they can use for a giveaway or review. If you're a content-based site that isn't selling anything, then this one's off the table.
- If you already have a big social following, you could try offering them a social mention of their site: post them on your brand's Facebook/Twitter/Google+ account. This is only going to work if you have a good-sized following; if you have 23 Facebook fans it'll be an insult to them.
In general, embrace the concept of giving and they will all start returning the favor. Start talking them up in your Twitter feed and linking to content they're publishing and they'll notice. Then, after warming the relationship, you can start finding ways to partner with them. The link should be something that you ask for after other communication, not at the beginning.
-
RE: Addthis - Bloated + Calls other sites?
Eric is correct - this is an unavoidable aspect of calling code from third-party websites.
If you want to speed it up further, you can get rid of AddThis and use hard-coded social sharing buttons. These are typically done in PHP or JS and pre-fill the sharing data with the current page's URL. There are some pre-programmed JS options (e.g. this list). The speed benefit comes from serving these from your own CDN - no third-party HTTP request for the user, and speed is limited only by the file size and the quality of your hosting/CDN instead of a third-party network.
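For the hard-coded route, the buttons are just anchor tags pointing at each network's public share endpoint with the current URL pre-filled. A sketch of the URL-building in Python for illustration - the endpoints below were the networks' long-standing public share intents, but verify them before shipping:

```python
from urllib.parse import urlencode

def share_links(page_url, title=""):
    """Build pre-filled share URLs to render into plain <a> tags server-side,
    so no third-party script has to load on the page."""
    return {
        "facebook": "https://www.facebook.com/sharer/sharer.php?"
                    + urlencode({"u": page_url}),
        "twitter": "https://twitter.com/intent/tweet?"
                   + urlencode({"url": page_url, "text": title}),
        "linkedin": "https://www.linkedin.com/shareArticle?"
                    + urlencode({"mini": "true", "url": page_url, "title": title}),
    }
```

The same logic ports directly to PHP or JS; the point is that the markup is static and served from your own infrastructure.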
I'm not sure about the Recommended Content - if that's a paid ad unit then the site speed hit is unavoidable. This is part of the general page speed bloat caused by ad tech and publishers. If it's not paid, then replace it with recommended content from your own site using a plugin or code that is generated by your own server, not theirs.
-
RE: What is the right schema.org link for a web design / developer / mobile agency?
I would also use ProfessionalService, which is a subtype of the LocalBusiness item.
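For example, a minimal JSON-LD sketch, embedded in a script type="application/ld+json" tag - all of the values here are placeholders:

```json
{
  "@context": "http://schema.org",
  "@type": "ProfessionalService",
  "name": "Example Web Design Agency",
  "url": "http://www.example.com/",
  "telephone": "+1-555-555-5555",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Seattle",
    "addressRegion": "WA"
  }
}
```

Because ProfessionalService inherits from LocalBusiness, all of the LocalBusiness properties (address, telephone, openingHours, etc.) are available on it.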
-
RE: Fix Bad Links in Google
I don't have an answer for all of your questions, but this Panda update may be the cause of the rankings drop:
http://searchengineland.com/google-panda-update-20-released-2-4-of-english-queries-impacted-135291
It's more likely to have affected a site based on content than links, however. Unless you're working with an Exact Match Domain, I'd guess this is the most likely cause of any significant ranking drops from the date you mentioned.
-
RE: SEO for Subdomains for different languages .com/fr, .com/es
Sorry, I can't help much re: multiple languages and how you should be tracking. I would probably track each subfolder as a separate campaign, but I'm going to defer to someone else's opinion on this question, since I don't deal with multi-language sites.
-
RE: I'm getting a Duplicate Content error in my Pro Dashboard for 2 versions of my Homepage. What is the best way to handle this issue?
You're going to want to implement a 301 redirect from http://www.accupos.com/index.php to http://www.accupos.com/
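If the site runs on Apache, a minimal .htaccess sketch for that redirect (assuming mod_rewrite is enabled; the THE_REQUEST condition prevents a redirect loop when the server internally serves index.php for /):

```apache
RewriteEngine On
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.php
RewriteRule ^index\.php$ http://www.accupos.com/ [R=301,L]
```

Test it with a direct request to /index.php and confirm you get a 301 status pointing at the root, not a 302.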
You'd also want to add a rel="canonical" tag to your index.php file to supplement the redirection, which will look like this: `<link rel="canonical" href="http://www.accupos.com/" />`
Other reading:
- Read more about redirects here http://www.seomoz.org/learn-seo/redirection
- More info on how redirects work: http://www.seomoz.org/blog/url-rewrites-and-301-redirects-how-does-it-all-work
- Visit this page and scroll down to the URL Structure section and start reading: http://www.seomoz.org/beginners-guide-to-seo/basics-of-search-engine-friendly-design-and-development
- Also worth reading up on, but be careful with this one, it's trickier to implement: http://www.seomoz.org/learn-seo/canonicalization
-
RE: Automated website forms question
Hi Ricky,
We've talked through this process with 1-2 clients in the past and here's what we came up with:
1. Use GravityForms for the form itself. It has the best interface and setup of the various WordPress form plugins I've used in the past 10+ years.
2. Connect GravityForms to Zapier. This is the simplest way to get the data out of WordPress in a programmatic way, and Zapier connects to a number of other solutions where the contract will actually be generated.
3. Choose a contract software solution that connects to Zapier. There are probably a few, which you can see at https://zapier.com/zapbook/. Cudasign, Hellosign, and RightSignature all appear to be options. Signaturit is in beta, and Docusign (my personal app of choice) and Echosign are "coming soon." It looks like Cudasign has an option for "Create Document From Template & Send Role-based Invite." This is probably the action you'd want to use in order to generate a new document and fill in various data points.
There will be lots of little quirks to test before you can deploy this for actual customers, but that overview of the process should get you started. Good luck!
-
RE: Help, a certain directory is not being indexed
As Lynn said, relative canonical tags could absolutely cause issues. That said, I'm seeing absolute URLs in the canonical tag now, so you may have fixed that in the past few days.
Also, I do see the Our Shops pages indexed when I search for site:smashrepairbid.com.au, but I don't see any other pages in the /our-shops/ directory aside from www.smashrepairbid.com.au/our-shops/?action=search
Your robots.txt is currently blocking /shops/. I don't think that's causing the issue, but it would be nice to remove it if it's not needed.
There's almost zero content on the pages I glanced at, e.g. http://www.smashrepairbid.com.au/our-shops/1263/bakker-towing/ and http://www.smashrepairbid.com.au/our-shops/1616/coastal-towing-service/. When you look at it from Google's perspective, there's very little value being added by these pages: no unique photos, no phone number, no website, etc. There are a million local business scrapers with more content than this, so why should Google bother indexing these pages?
Try pulling up your logs and seeing if these URLs have been requested by Google's spiders. Here's a good guide from Ian Lurie on how to do that in Excel: http://www.portent.com/blog/analytics/how-to-read-a-web-site-log-file.htm
If the spiders are crawling those shop URLs but aren't indexing them, I think the first thing to do is add way more content to the pages.
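If you'd rather script the log check than work in Excel, here's a minimal Python sketch, assuming combined-format (Apache/Nginx) access logs:

```python
import re

# Combined log format:
# IP - - [date] "METHOD /path HTTP/1.1" status size "referer" "user-agent"
LOG_LINE = re.compile(
    r'"(?:GET|HEAD) (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3}) .*"(?P<ua>[^"]*)"$'
)

def googlebot_hits(log_lines, path_prefix="/our-shops/"):
    """Yield (path, status) for Googlebot requests under the given prefix."""
    for line in log_lines:
        m = LOG_LINE.search(line)
        if m and "Googlebot" in m.group("ua") and m.group("path").startswith(path_prefix):
            yield m.group("path"), int(m.group("status"))
```

Keep in mind anyone can fake a Googlebot user-agent; for a definitive check, do a reverse DNS lookup on the requesting IPs.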
-
RE: Bad links, from bad sites... how to get a solution?
Are these showing up in Google Analytics? There are a number of ways that these sites can fake visits to your site to entice you to visit their site when you see them listed in website referral traffic.
As Takeshi said, I would look to see if they're actually linking to your website or not. These sites probably wouldn't show up in Open Site Explorer because they're so low-quality; however, I don't see these links listed in other backlink monitoring tools, either.
That leads me to think it's probably referral spam.
-
Correct Hreflang & Canonical Implementation for Multilingual Site
OK, 2 primary questions for a multilingual site. This specific site has 2 languages, so I'll use that for the examples.
1 - Self-Referencing Hreflang Tag Necessary?
The first is regarding the correct implementation of hreflang, and whether or not I should have a self-referencing hreflang tag.
In other words, if I am looking at the source code for http://www.example.com/es/ (our Spanish subfolder), I am uncertain whether the source code should contain the second line below:
`<link rel="alternate" hreflang="en" href="http://www.example.com/" />`
`<link rel="alternate" hreflang="es" href="http://www.example.com/es/" />`
Obviously the Spanish version should reference the English version, but does it need to reference itself? I have seen both versions implemented, with seemingly good results, but I want to know the best practice if it exists.
2 - Canonical of Current Language or Default Language?
The second question is regarding which canonical to use on the secondary language pages. I am aware of the recent update to the Google Webmaster Guidelines that says not to use canonical, but they say not to do it because everyone was messing it up, not because it shouldn't be done.
So, in other words, if I am looking at the source code for http://www.example.com/es/ (our Spanish subfolder), which of the two following canonicals is correct?
- `<link rel="canonical" href="http://www.example.com/es/" />` OR
- `<link rel="canonical" href="http://www.example.com/" />`
For this question, you can assume that (A) the English version of the site is our default and (B) the content is identical.
Thanks guys, feel free to ask any qualifiers you think are relevant.
-
RE: Is this tabbed implementation of SEO copy correct (i.e. good for getting indexed and in an ok spot in the html as viewed by search bots?
My general rule of thumb is that as long as all of the content is delivered via HTML (which it appears to be), and the switching of the tabs is done via JavaScript (which it is), then you're mostly OK.
You do have one issue though - the current code on http://seatgeek.com/miami-heat-tickets/ doesn't degrade gracefully. You recognized this in your notes: if a user doesn't have JavaScript turned on, they can't access the text. That's a usability issue, and you could argue it might be bad for SEO, but either way I believe it should be fixed. When JavaScript isn't enabled, the content should still load below the event listings. Typically that means the content should load that way by default, with JavaScript hiding the tab content on page load and showing it when the user clicks the tab.
Ideally the content would be made easily available (currently the tabs aren't as intuitive as they are on a Facebook page, for example). Putting them above the photo might help that?
Also, from a user perspective, the written content is mostly there for SEO purposes right now. Stuff like the price stats is cool information that I would find interesting while shopping for tickets - maybe there's a way to show that graphically on the page in a more interesting way than text?
Update - I just noticed that those stats are displayed on http://seatgeek.com/miami-heat-ticket-prices in an awesome way - do stuff like that for all of your pages!
On the same tabs topic, but separate from your implementation, I've seen companies load content from an XML file using JavaScript. That is definitely not SEO-friendly and can cause indexation issues.
-
RE: New Social Media Site + META Tags for User Profiles
I would skip meta keywords - not worth your time to implement something that is dynamic on each page.
You could choose to skip meta description, and let the search engines figure it out on their own.
The most important question to ask yourself is, what search terms would those profile pages be ranking for, and what would that searcher be looking for? I'm guessing that 90% of people will be searching for the person's name, so your page would be showing up in the results for "Kane Jamison", for example.
Take a look at what some of the big social networks choose to display:
Facebook:
"Facebook is a social utility that connects people with friends and others who work, study and live around them. People use Facebook to keep up with friends, upload an unlimited number of photos, post links and videos, and learn more about the people they meet."
Twitter:
No Meta Description
LinkedIn:
"View Kane Jamison's professional profile on LinkedIn. LinkedIn is the world's largest business network, helping professionals like Kane Jamison discover inside connections to recommended job candidates, industry experts, and business partners."
Google+:
They use a format containing the following: "Full name - Profile Headline - Occupation - Employer - City, State"
It looks like this in practice: "Kane Jamison - SEO, Link Builder, Website Manager, Permaculturalist, and Urban Homesteader - Online Marketing, SEO & Web Design - Hood Web Management - Seattle, WA"
Quora:
No Meta Description. Interestingly they do use an Open Graph (OG) description for Facebook purposes.
If I were you, I'd find a happy medium between customized (containing name and description) and general (super concise info about your social network, similar to Facebook and LinkedIn's meta descriptions).
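To illustrate that happy medium, here's a hypothetical template sketch in Python - the site name, boilerplate copy, and 155-character cap are all placeholder assumptions to swap for your own:

```python
def profile_meta_description(name, site_name="ExampleNet", headline="", limit=155):
    """Blend a personalized lead (name, optional headline) with short
    boilerplate about the network, truncated to a SERP-friendly length."""
    lead = "View {}'s profile on {}.".format(name, site_name)
    if headline:
        lead += " " + headline + "."
    desc = lead + " {} helps people share and discover new connections.".format(site_name)
    # Truncate with an ellipsis if the personalized parts run long.
    return desc if len(desc) <= limit else desc[:limit - 1].rstrip() + "\u2026"
```

The same pattern works in whatever templating layer your profile pages already use; the key is that the name appears early in the description, since that's the query most profile pages will rank for.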
What type of social network is it?