Category: Technical SEO Issues
Discuss site health, structure, and other technical SEO issues.
-
If non-paying customers only get a 2 min snippet of a video, can my video length in sitemap.xml be the full length?
For two reasons, I would put the snippet length for your videos in your sitemap. First, Google can crawl the videos and may detect that the actual length doesn't match what you say it is. Second, Google has a published policy for newspapers that have paywalls and don't allow free full access: http://googlenewsblog.blogspot.com/2009/12/update-to-first-click-free.html. In short: "We will crawl, index and treat as 'free' any preview pages - generally the headline and first few paragraphs of a story - that they make available to us. This means that our crawlers see the exact same content that will be shown for free to a user. Because the preview page is identical for both users and the crawlers, it's not cloaking."
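Following that logic, the snippet length would go in the sitemap's `<video:duration>` element, which Google's video sitemap schema expresses in seconds. A minimal sketch of one entry (the URLs and titles here are hypothetical placeholders):

```xml
<url>
  <loc>http://www.example.com/videos/sample-video.html</loc>
  <video:video>
    <video:thumbnail_loc>http://www.example.com/thumbs/sample.jpg</video:thumbnail_loc>
    <video:title>Sample video</video:title>
    <video:description>Free two-minute preview snippet.</video:description>
    <!-- duration of the freely viewable snippet, in seconds (2 min = 120) -->
    <video:duration>120</video:duration>
  </video:video>
</url>
```

The `<urlset>` root element would also need the `xmlns:video="http://www.google.com/schemas/sitemap-video/1.1"` namespace declaration.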
| AdamThompson0 -
Best way to display maintenance mode on a website?
The Google Webmaster Central blog advises using a 503 for planned downtime (and bandwidth overruns) in its January 2011 post at http://googlewebmastercentral.blogspot.com/2011/01/how-to-deal-with-planned-site-downtime.html.
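As a rough sketch of that advice, here's a minimal WSGI app (a hypothetical example, not tied to any particular stack) that answers every request with a 503 and a Retry-After header so crawlers know the downtime is temporary:

```python
def maintenance_app(environ, start_response):
    """Serve HTTP 503 for every request during planned maintenance.

    The 503 status tells search engines the outage is temporary;
    Retry-After hints when they should come back (here, one hour).
    """
    body = b"Down for scheduled maintenance. Please check back soon."
    start_response(
        "503 Service Unavailable",
        [
            ("Content-Type", "text/plain; charset=utf-8"),
            ("Retry-After", "3600"),  # seconds until retry
            ("Content-Length", str(len(body))),
        ],
    )
    return [body]
```

The same idea applies to any server: the key parts are the 503 status code and the Retry-After header, not the framework.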
| KeriMorgret0 -
Duplicate content question with PDF
Having duplicate content within your own site is not as big a deal as duplicate content from another site. Since you can't use meta tags on PDFs, you'd have to use robots.txt to keep Google from indexing them. Nofollowing the links won't necessarily get or keep them out of the index. However, if people are linking to your PDFs, blocking them with robots.txt means you'll lose all link juice pointed at them. Something to consider, at least.
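A minimal robots.txt sketch for blocking PDFs, assuming Googlebot's support for the `*` and `$` pattern wildcards (the path pattern is an assumption about where your PDFs live, so adjust to match your URLs):

```
User-agent: *
Disallow: /*.pdf$
```

Note the trade-off from above still applies: blocked PDFs can't pass along any link equity they've earned.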
| AdamThompson0 -
The effect of same IP addresses on SERPs
Hi Guys, Thanks for the responses, I appreciate both of your points. The main reason for me to do it is increased visibility in the SERPs. The original site sometimes ranks 1st and 2nd and has that position pretty much secured. It no longer requires active resources for link building and over time it will get these naturally. I ask because I have recently acquired the .com TLD and, instead of just 301'ing this, I thought I could make use of it and get maybe positions 3 and 4 out of it. All content is unique, all links are natural and editorial references, and there is no competition that could touch it (that I can see :). The question really boils down to whether G will rank two sites that exist on the same IP on the same SERP. Does anyone know if this is possible or if there are factors in place to prevent this? Thanks Ben
| Audiohype0 -
How to index our dynamic Servlets to profit from inbound Links?
Manuel, I can think of a couple of good options: 1) use meta tags to set the page to noindex,follow so search engines won't index it but will pass the link juice on to other pages on your site, or 2) let the page get indexed and use the canonical tag to ensure that only one version is indexed.
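For the first option, the meta tag would look like this (placed in the page's `<head>`):

```html
<!-- Keeps this page out of the index while still letting crawlers
     follow its links and pass equity to the rest of the site -->
<meta name="robots" content="noindex,follow">
```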
| AdamThompson0 -
How do I set up a site review for a password protected site?
Hi Sara, I checked and wasn't able to find a way for "Roger" (the mozBot) to crawl a development site (i.e. one that is password protected). Here is a post that gives more information on how your site is crawled. Maybe you could migrate the site to a public subdomain and have the SEOmoz bot crawl it there? It would at least give you some on-page SEO feedback. Good luck!
| jsturgeon0 -
Front page dropped to PR1 - thoughts?
@Frank: As to paid links, I highly doubt it. My supervisors and I are pretty militant against black hat, and I know I haven't bought any links. I can run another backlink report tonight, but it's not likely we'll find any. I have found a handful (i.e. not even half a dozen) random links that appear highly spammy, but they aren't on interrelated sites/link neighborhoods, so I don't think that's the issue.
| ufmedia0 -
"Search Engine Blocked by robots.txt" warnings for filter search result pages -- why?
Thanks, Keri, for your advice.
| languedoc0 -
Preserving Link Value
Assuming these companies are closely related in market, you might just get away with 301 redirects. Is it perhaps possible to port the actual content on domain B to domain A (for example blogs, news items, events, etc.) -- pretty much anything that has heavy backlinks? You might want to give the 'top pages on domain' tool by SEOmoz a go to see which pages are most important to conserve link value for.
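If you go the 301 route and domain B runs on Apache with mod_rewrite, a sketch of a catch-all redirect in domain B's .htaccess might look like this (the domain names are hypothetical placeholders):

```apache
# 301-redirect every URL on domain B to the same path on domain A,
# so existing backlinks pass their value to the corresponding page.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?domain-b\.com$ [NC]
RewriteRule ^(.*)$ http://www.domain-a.com/$1 [R=301,L]
```

Path-for-path redirects like this preserve more link value than pointing everything at domain A's homepage.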
| Theo-NL0 -
301 Redirects
I think you need to implement a 'canonical tag' on this page. Check out Rand's post on it here: http://www.seomoz.org/blog/canonical-url-tag-the-most-important-advancement-in-seo-practices-since-sitemaps
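The tag itself is a one-liner in the duplicate page's `<head>`; the href here is a hypothetical example of the preferred URL:

```html
<!-- Tells search engines which URL should receive the ranking credit -->
<link rel="canonical" href="http://www.example.com/preferred-page/">
```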
| rickdronkers0 -
Canonical Link for Duplicate Content
Yes, it might take time for the duplicate to get removed from search, but crediting the base page will start with the first crawl after implementation.
| wissamdandan0 -
Meta Title Keywords and Company name
We do have different titles on each page, which also rank fairly well for their subjects. We plan on improving those after our focus on the homepage. Thank you for your reply!
| pivotpointsecurity0 -
Is robots.txt a must-have for 150 page well-structured site?
Thanks, Keri. No, it's a hand-built blog. No CMS. I think the googlebot is doing a good job of indexing my site. The site is small and when I search for my content I do find it in google. I was pretty sure that google worked the way you describe. So it sounds like sitemaps are an optional hint, and perhaps not needed for relatively small sites (couple hundred pages of well linked content). Thanks.
| scanlin0 -
Should I move x-cart installation or 301 redirect?
I think the 3rd option is a good one to go with. Just make sure site.com/store is optimized like a traditional homepage. Since you are not changing domains, I don't think redirecting all pages as in option 1 is worth it, like you said. Option 2 is good because you can really control the page, its layout, and its text/links for super optimization, but the work will double because it is not linked with your CMS. If I were you, I'd probably choose #3.
| itrogers0 -
How to use overlays without getting a Google penalty
Thanks. We've decided not to go down the noindex route. Although these pages don't have significant rankings and therefore don't drive much SEO traffic, they do contribute to the overall authority of the site and the directories in which they sit. For example, a hotel deal sits within the hotels directory/subfolder, so we fear that noindexing all these deal pages would undermine the overall authority of that directory. I think we are going to make the first paragraph visible to users and search engines, and probably look at combining that with First Click Free.
| Red_Mud_Rookie0