Category: Technical SEO Issues
Discuss site health, structure, and other technical SEO issues.
-
One page with multiple syndicated feeds
So, essentially, you have snippets from various sources that all appear on one page? Unfortunately, no, there's no way to signify syndicated content from more than one source. Both the canonical tag and the syndication-source tag can only be used once, to the best of my knowledge. If you use it multiple times, Google will just ignore the additional tags. Just picking one source and cross-domain canonicalizing should solve the immediate risks and SEO issues, even if it isn't 100% correct. Syndication is a topic Google is really only beginning to handle well, and there are a lot of gray areas that the current solutions don't cover. The other option would be to just META NOINDEX those pages, but that depends on their value and the structure of the site. Whether you NOINDEX or canonicalize them, you're basically knocking them out of search results.
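To make that concrete, here's a minimal sketch of the two options (the source URL is purely hypothetical - you'd point at whichever single source you pick):

```html
<!-- Option 1: pick one source and cross-domain canonicalize to it -->
<link rel="canonical" href="https://original-source.example.com/article" />

<!-- Option 2: keep the aggregated page out of the index entirely -->
<meta name="robots" content="noindex, follow" />
```

Either tag goes in the `<head>` of the aggregated page; you'd use one or the other, not both.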
| Dr-Pete0 -
SEO on a .dk site
As far as duplicate content goes, you will be fine as long as they have different TLDs
| AlanMosley0 -
For Google + purposes, should the author's name appear in the Meta description or title tag of my web site just as you would your key search phrase?
Hi Lowell, To add to what Lonnie said, not long ago Google changed the instructions for adding authorship markup to your content. You can find the instructions here: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1408986 Adding your name to the meta description or title tag isn't necessary for authorship markup, but doing so is fine if you feel it will help your personal branding. Just a note - Google's use of authorship is still in the early stages. It's unclear whether following their instructions will have any impact on rankings or relevancy. That said, most SEOs I know are adding this markup because even if Google isn't using it now, they undoubtedly have plans to use it in the future.
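As a quick illustration of the markup itself (the profile URL below is a placeholder - you'd use your own Google+ profile, and follow the verification steps in the instructions linked above):

```html
<!-- Link from your content to your Google+ profile using rel="author" -->
<a rel="author" href="https://plus.google.com/00000000000000000000">About the Author</a>
```

Note that Google also expects the profile to link back to the site (the "Contributor to" section), so the markup alone isn't enough.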
| Cyrus-Shepard0 -
OSE - still showing me links are there when they are not?
Interestingly, up until this past Jan. 17th, each index has been completely updated. This means that the links that appeared were found in the past 3-4 weeks before the Index was published. This changed with the last update. Now it's possible to see links indexed many months ago, which makes for a larger index, but means that you may see some links in there that have dropped off the graph. So it's weird that you're finding these links, since this change was only implemented 2 weeks ago. That said, with any index it's common to see certain links disappear quickly. As for alternatives, I'm afraid there aren't many. You could try Majestic, which is a great service with a large index - but that large index comes with a price. My experience has been they historically include more of these types of missing links than Moz. (this isn't meant as a criticism - different link indexes can serve different purposes) Regardless, I'd be curious, like you, to know if others are experiencing the same issue... Please weigh in if you are seeing this, or feel free to write the help team at help@seomoz.org.
| Cyrus-Shepard0 -
Google Crawler Error / restricting crawling
Seems like you've done everything right. You could also add a meta robots "NOINDEX, FOLLOW" to those pages. I'd also double-check the "Linked from" referrers in Webmaster Tools just to make sure you haven't missed any live, followed links pointing to those pages. When did you submit the removal request, and what is its status (approved, denied, pending)? Another question: are those pages in Google's index?
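If editing the page templates is awkward, the same directive can be sent as an HTTP header instead of a meta tag. A sketch for Apache with mod_headers enabled (the filename pattern is hypothetical - match it to your own URLs):

```apache
# Send the noindex directive as an X-Robots-Tag header
# for the pages you want dropped from the index
<FilesMatch "^restricted-.*\.html$">
  Header set X-Robots-Tag "noindex, follow"
</FilesMatch>
```

This is handy for non-HTML files (PDFs, images) where a meta tag isn't an option.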
| Cyrus-Shepard0 -
Checkout on different domain
In my opinion there isn't really any downside to this from a Google perspective; as you said, those pages shouldn't even be indexed anyway. Many, many vendors out there have their charge/fulfillment go straight to PayPal, for example, and don't host any checkout-specific code (other than cart-building, account creation, etc.) on their site at all. There's also the case where multiple microsites all use the same checkout on another domain, to centralize checkouts. As far as I know those sites aren't punished either, and it definitely saves money on secure certificates. There is, however, another angle to consider, and that is the human angle. Some people (who aren't savvy about ecommerce) might be alarmed that their secure checkout is occurring on a different domain than the one they've been browsing. This is a security/conversion-rate issue, though, so you may already know of it. In my opinion I would leave it alone and not bother with the iframe tricks and so on. A subdomain might be more reassuring to the user (e.g. secure.printingforless.com instead of printingforless1.com), but I honestly can't see why the current setup would have Google implications, as long as your SSL/non-SSL pages are separate and canonicalized properly.
| icecarats0 -
Very, very confusing behaviour with 301s. Help needed!
There are two problems you're experiencing, and there is a common thread here: you're deleting (or not re-implementing) the "old" redirects. As you know, when you 301 redirect page A, you have to keep that redirect in place for as long as there is a website on that domain. There is no threshold where, after a year for example, a 301 redirect is no longer needed. There's a good video from Matt Cutts that I posted in the references; you could conclude that, in a way, a 301 redirect should never be deleted, especially once Google starts receiving mixed signals about the domain. So the solution to your problem is to re-implement the old redirects you had before. Hope this helps: https://www.youtube.com/watch?v=QyQs3tz7ZKo
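For reference, keeping old and new redirects side by side in Apache looks something like this (the paths are hypothetical; this assumes mod_alias is available):

```apache
# Old redirects stay in place permanently,
# alongside any newer ones you add
Redirect 301 /old-page.html /current-page.html
Redirect 301 /even-older-page.html /current-page.html
```

Removing the older lines is exactly what causes the mixed signals described above.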
| wissamdandan0 -
Watermarking Keywords
You're most welcome, Julie. This community on SEOmoz is certainly amongst the very best and is managed and moderated extremely well by the Moz community team. There are some excellent SEO guides available here, both basic and advanced, plus there are loads of awesome blog posts and videos to help those who want to learn more about the ever-changing world of SEO and search. Don't be too disheartened - there are a lot of true professionals out there who'd be happy to help with your search and business aspirations.
| SimonCullum0 -
Robots.txt question
Oh - and it won't affect the domain negatively when you're cleaning up your site directories via robots.txt. It's actually better, as I explained below.
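A minimal sketch of that kind of cleanup (the directory names are hypothetical - substitute the housekeeping directories you actually want kept out of the crawl):

```
# robots.txt - keep crawlers out of non-content directories
User-agent: *
Disallow: /tmp/
Disallow: /admin/
```

Keep in mind robots.txt blocks crawling, not indexing - URLs that are already indexed or heavily linked may still appear in results.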
| RobMay0 -
Robots.txt versus sitemap
I would also take the time to clean up your XML sitemap file for crawling, just in case. It'll be better for you to keep track of any files/URLs you don't want indexed by the search bots. Just good practice.
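After cleanup, the sitemap should list only the canonical, indexable URLs - a minimal sketch (the URLs are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
  </url>
  <url>
    <loc>https://www.example.com/products/</loc>
  </url>
</urlset>
```

Anything you've disallowed in robots.txt or tagged noindex shouldn't appear in this file, so the two don't send contradictory signals.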
| RobMay0 -
Keyword rankings are lost!!
Chitra, I deleted the other question so that this question could be contained to just one thread. You've asked this in the Q&A area, which is filled with other Pro members like you, and a few SEOmoz Associates. None of us are mind readers though, so we need some more information from you (like your website URL and keywords) to be able to help you diagnose what is happening.
| KeriMorgret0 -
H1 problem on my site - not sure how to solve it
Diane, That was a nice bit of assistance from Ennovation, so I thumbed it up. One note here: for some reason, someone has made all of this too difficult for you with your CMS. We use Joomla, and our clients do not have these problems. (We have one client with six Joomla sites who can hardly turn on his computer.) Somehow you are having to deal with code when you should not be. Even if today you decided to put in a new logo image, change page titles, create new meta descriptions, and change content, you should be able to do that without the need for the developer. As it is, you are on Joomla 1.5 when 1.7 is out. You are using a program that cleans up URLs when 1.6 took care of any issues. You are a publisher, and you should be worried about publishing. Even in my firm, my people stay after me to do what I do - no, not SEO, which is what got me here, but clients and strategy, which is what builds a company. I only do SEO about half as much as I did even a year ago. Publishers publish. Talk with your developer and find out why you are having so many issues. If you cannot get it worked out in a few days (yes, I am serious), find a different developer. There are a lot of good developers on SEOmoz who actually reside in the U.K. (remember, they do talk a bit funny). Read the ones that sound great and PM them. Engage a quality developer and make life simpler for yourself. Yes, if you like learning SEO, hang around with us and learn, but simplify first. All the best,
| RobertFisher0 -
404 errors on non-existent URLs
Hi Matthew, Thanks for the prompt response. Yeah, that's pretty much what I was thinking too - I know it's a pretty basic aspect, but I just sort of wanted someone to corroborate the process. Sorry if it sounded like I was suggesting that just because content never existed there, that's a reason not to 404 - that wasn't my intention. Thanks again.
| AJ2340