Category: Technical SEO Issues
Discuss site health, structure, and other technical SEO issues.
-
301 redirects & merging two sites into one
Hi Gareth, The best practice for this is unfortunately TEDIOUS. You'd need to 301 each page to the corresponding page on the new domain, as well as do a 301 redirect from the root domain to the new root domain. Redirecting only the root domain won't make the individual 301s unnecessary; it will only pass the page authority of the old homepage to the new domain. So essentially you'll lose the individual page authority of any pages that haven't been 301 redirected individually. Hope that makes sense and helps. It's tedious, but there are likely some tools out there to mitigate the time involved. Thanks, Anthony
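For reference, the per-page mapping described above might look like this in the old domain's .htaccess, assuming Apache (the page paths and domain names here are hypothetical):

```apache
# Map each old page to its corresponding page on the new domain.
# Specific rules are listed first so they match before the catch-all.
Redirect 301 /about-us.html https://www.newdomain.com/about/
Redirect 301 /services.html https://www.newdomain.com/our-services/

# Catch-all: send the root and any unmapped pages to the new homepage.
# Pages caught here lose their individual page authority, as noted above.
RedirectMatch 301 ^/(.*)$ https://www.newdomain.com/
```
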
| Anthony_NorthSEO0 -
.me vs .com for new personal blog site
Thanks Phil, really appreciate the quick and helpful answer, I've only been a member a short while (and I feel like I've only just scratched the surface so far) but I've found SEOMoz a huge asset already, cheers! Nick
| NickDavis0 -
How damaging is duplicate content in a forum?
Yes, having duplicate content in your forum could hurt your ability to rank. And yes, the canonical tag sounds like the best way to deal with your situation.
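For reference, a canonical tag on a duplicate forum page is a single line in the page's head, pointing at the preferred version of the thread (the URL here is hypothetical):

```html
<!-- In the <head> of each duplicate page, pointing to the preferred URL -->
<link rel="canonical" href="https://www.example.com/forum/topic-1234/" />
```
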
| AdamThompson0 -
Do web pages have to be linked to a menu?
ok... After some thought I have figured out a way to add these links and have it make sense. At least it will be defensible. Thanks for your help.
| Banknotes0 -
301 for old domain to new domain - Joomla plugin or cpanel?
.htaccess is always the better approach when doing redirects; it's easier and gives you a bit more flexibility. Good answer, Kevin.
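A minimal sketch of the whole-domain .htaccess redirect, assuming Apache with mod_rewrite enabled (domain names are placeholders):

```apache
RewriteEngine On
# Permanently redirect every request on the old domain
# to the same path on the new domain.
RewriteCond %{HTTP_HOST} ^(www\.)?olddomain\.com$ [NC]
RewriteRule ^(.*)$ https://www.newdomain.com/$1 [R=301,L]
```
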
| blacey0 -
Is it important to nofollow badges in the footer that link to review sites and other branding information?
I would say these badges are good to link to, since they're reputable companies. Linking to the good reviews may boost them in SERPs for your branded search queries, which is better than any bad review sites getting that spot. There are a lot of badges there, though, and maybe you're passing out too much juice. Perhaps keep the BBB, VeriSign, and McAfee links followed and nofollow the rest. Just an idea.
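For reference, nofollowing a badge link is a one-attribute change (the badge images and URLs here are hypothetical):

```html
<!-- Followed badge link: passes link equity to the review site -->
<a href="https://www.bbb.org/business-profile/example">
  <img src="/images/bbb-badge.png" alt="BBB Accredited Business">
</a>

<!-- Nofollowed badge link: asks engines not to pass equity -->
<a href="https://www.example-reviews.com/profile" rel="nofollow">
  <img src="/images/review-badge.png" alt="Review Site Badge">
</a>
```
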
| seoninja200 -
Nofollow link passing link juice
I know this to be false. Further, I've seen them actually improve rankings where no other link building has taken place. They certainly don't pass full value, but they do pass some.
| DigitalGuru0 -
Portfolio website: reciprocal backlinks and redirects
Hi Frank, First, let me answer your question to Alan. The main advantage of NOINDEX, FOLLOW is that robots can still crawl these pages and pass any link juice that flows through them. So any links pointing to these pages, whether from your own site or external sources, can return value to your site through the links on these pages. Make sense? If the search engines are excluded from crawling, then they can't discover any links on these pages that flow back to your site. Also, it seems that you would want to keep these pages out of Google's index, as they are completely unrelated to your site (unless you wrapped them in branding and re-purposed the content as a sort of portfolio). Sometimes you can get an over-optimization penalty for too many exact-match anchor text links pointing back to your site, so watch the anchor text on the reciprocal links and try to vary it as much as possible.
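For reference, the NOINDEX, FOLLOW directive mentioned above is a meta tag in the page's head:

```html
<!-- Keep this page out of the index, but let robots crawl and follow its links -->
<meta name="robots" content="noindex, follow">
```
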
| Cyrus-Shepard0 -
Removing irrelevant items from Google News?
Gain more relevant news press and push the older stories down...
| Mcarle0 -
Optimizing Dynamic landing pages in specific Geo-Locations
Hi Yanez, I have some concerns about the dynamic scenario and how it relates to Local Search. I actually spoke about this with Darren Shaw of WhiteSpark and he gave me permission to quote him on this: If set up properly, I think it can be fine for the search engines:
BAD: domain.com/locations.php > detects IP and loads the location info into the page.
OK: domain.com/locations.php > detects IP and 301 redirects to the correct location page like /denver.php. It's ok, as long as there is a crawlable list of all location pages on the site.
BEST: domain.com/locations.php > detects googlebot, and if not googlebot, detects IP and 301 redirects to the correct location page like /denver.php. It's ok, as long as there is a crawlable list of all location pages on the site.
This could be called cloaking, but it's actually not a concern to the engines. There are tons of examples of sites using cloaking for good, not evil, and Google is fine with it. Google has no problem with cloaking when it benefits the user and isn't used to serve different content to the engines that a user would never see. Hope this helps! Miriam
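The "BEST" pattern quoted above can be sketched as server-side decision logic. This Python sketch is purely illustrative: the IP-to-city table and page paths are hypothetical, and a real site would use a geo-IP database or service rather than a hard-coded dictionary.

```python
import re

GOOGLEBOT_PATTERN = re.compile(r"Googlebot", re.IGNORECASE)

# Hypothetical IP-to-city lookup; a real site would use a geo-IP service.
IP_TO_CITY = {
    "203.0.113.7": "denver",
    "198.51.100.9": "boston",
}

def locations_response(user_agent, ip):
    """Return (status, path) for a request to /locations.php.

    Serve the crawlable locations index to Googlebot, and 301-redirect
    human visitors to their city page when their IP resolves to one.
    """
    if GOOGLEBOT_PATTERN.search(user_agent):
        # Googlebot gets the index page listing every location
        return (200, "/locations.php")
    city = IP_TO_CITY.get(ip)
    if city:
        # Known visitor location: permanent redirect to the city page
        return (301, "/%s.php" % city)
    # Unknown IP: fall back to the index rather than guessing
    return (200, "/locations.php")
```

The key point is the first branch: the engine always sees the full, crawlable index, so every location page stays discoverable.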
| MiriamEllis0 -
What to do if my site was De-indexed?
You're welcome, Jegghead! I'm glad to help. Yes, it's frustrating that Google doesn't accept HTML sitemaps; mine errored on Sunday and I dropped about 75 positions on several keywords I had held in Google's top three for a while now. I'm trying to figure out why that happened and if I can fix it. We are practically neighbors! My website is www.integritymcseo.com. I have been trying to find local online marketers to network with.
| brianhughes1 -
Best practices for migrating an html sitemap? Or just get rid of it all together?
Any thoughts on the impact of removing the internal links in the sitemap? Will this hurt our domain authority? Or, given the low number of links compared to our whole link profile, is it not significant enough to cause concern?
| BostonWright0 -
Google places page where is my additional information
Hi Bristolweb, In July of 2011, Google made a significant change in the information they actually display on the Google Place Page. You should still fill everything out that you can, but Google is only going to publicly display part of it. Rest assured, however, they see everything you've input. Here are 2 great Mike Blumenthal articles on the changes: http://blumenthals.com/blog/2011/07/21/google-places-testing-new-layout/ and http://blumenthals.com/blog/2011/07/26/google-places-what-else-went-missing-on-the-places-page-in-the-update/ Descriptions have returned in most cases, but other things like the old additional info and citations are no longer part of the public Place Page. Hope this answers your first question. Optimizing a Google Place Page hinges on 4 key things:
1. Adhere to the Google Places Quality Guidelines to the letter (http://support.google.com/places/bin/answer.py?hl=en&answer=107528 *Note: this is the North American link. If you are located elsewhere, your guidelines may be slightly different. You should find them, memorize them, and check back frequently, because they do change).
2. Choose the best 5 categories for the business.
3. Write a good description.
4. Be clear about the business model. Does it serve clients at the brick-and-mortar location, in which case you may show the address, or is it a go-to-client business model that must hide the address?
Once you've done this, gaining reviews and citations is the typical SEO work ahead of you. In my work as a Local SEO, it never fails to amaze me how many local business owners make this whole thing worlds harder for themselves than it should be because they don't read the guidelines. My rule-of-thumb advice is that the most important factor of an optimized Place Page is its lack of violations. It's also important to remember that, these days, the strength of the website is the top overall ranking factor in most local results. Hope this helps! Miriam
| MiriamEllis0