Category: Technical SEO Issues
Discuss site health, structure, and other technical SEO issues.
-
I always get this error "We have detected that the domain or subfolder does not respond to web requests." I don't know why. PLEASE help
Thumbs up to Kane's advice. If you checked with your host and "everything was fine," but 10 minutes later the site was working again, you might want to set up an uptime tracker.
| JaredMumford0 -
Duplicate content due to csref
Yes, to set up rel-canonical properly, every page that could conceivably be tagged with a csref= parameter should have a self-referencing canonical. The tags are easy to set up, in theory, but once you get into a large site and/or CMS, setting them up on dozens or hundreds of pages can be tricky. Ultimately, it's a more effective approach that has some other benefits (like scooping up stray duplicates that may have been created by other URL parameters), but it really depends on your development resources and how complex your site is.
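As a minimal sketch of what a self-referencing canonical looks like (the URL here is hypothetical), it's a single tag in each page's head, and any csref= variant of that URL consolidates back to the clean version:

```html
<!-- In the <head> of https://www.example.com/widgets -->
<!-- The canonical points at the page's own clean URL, so parameter
     variants like /widgets?csref=nav consolidate back to it -->
<link rel="canonical" href="https://www.example.com/widgets" />
```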
| Dr-Pete0 -
I know I'm missing pages with my page level 301 re-directs. What can I do?
It really depends on the platform you're on and the way the page-level redirects are set up, but if you list all the rules for the existing pages, you can always add a catch-all redirect at the very end. If implemented properly, anything left over should just hit that rule. The alternative is to build a custom 404 handler that actually issues a 301 redirect to the new site. I'd agree with this post, though - if the content really is dead, in some cases it's better to let it 404 - http://www.seroundtable.com/archives/022739.html If you're really starting over, and for pages that aren't very active (no links, very little traffic), it can make more sense to clean things up. There's no one-size-fits-all answer - it depends a lot on the scope of the site and the nature of the change.
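A sketch of the catch-all approach in an Apache .htaccess file (all domain names and paths here are hypothetical): the specific page-level rules go first, and a final rule sweeps up anything they missed, since mod_alias applies the first matching directive:

```apache
# Page-level redirects first (first match wins)
Redirect 301 /old-about.html https://www.newsite.com/about/
Redirect 301 /old-contact.html https://www.newsite.com/contact/

# Catch-all last: anything not matched above goes to the new homepage
RedirectMatch 301 ^/.* https://www.newsite.com/
```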
| Dr-Pete0 -
301 redirect blog posts from old URL to new one
Redirection is a good plugin, though I've had two cases where it interacted strangely with another plugin, in which case I used Simple 301 Redirects successfully without interaction issues. That said, Redirection provides some more advanced options, like redirecting based on the referring site, which has come in handy.
| KaneJamison0 -
Bad reviews coming next to the company website, how to remove those ??
That's a great point about social media. Often, social media sites will have a much greater authority than the review sites. If you build up a strong online presence on even the basics - Facebook, Twitter, YouTube, Google+, Pinterest, Vimeo - then you will be pushing the bad reviews further and further down the results. Even better than that, you will be showcasing the solid community that you have built up. People will go on these social sites and learn more about you, and if you are communicating well then they will surely like you. It will give you a great competitive advantage.
| intSchools0 -
How to optimize anchor text links on ecommerce category page
This pretty much mirrors what I was going to say.... "I can't find an example of how it would be coded (check link above!), but you could look at having a single a href stemming from the product text, and expanding the clickable area to include the image. This would get around your issue, and I've always thought that having multiple links to a single product like that is sub-optimal. I'm not sure how much added SEO benefit this would have, but all the small things..."
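One way to sketch that single-link markup (the product URL, image path, and class name are hypothetical): a single anchor wraps both the image and the product text, so the whole card is clickable but only one link points at the product:

```html
<!-- One anchor wraps both the image and the product name,
     avoiding two separate links to the same product URL -->
<a href="/products/widget-pro">
  <img src="/img/widget-pro.jpg" alt="Widget Pro">
  <span class="product-name">Widget Pro</span>
</a>
```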
| AndrewAkesson0 -
Google suddenly remove all my keywords?
Unfortunately, it's not guaranteed that Google will send you an email - you have to keep track of your links yourself, since it's your baby you're building!
| MoosaHemani0 -
Is it better to have URLs of internal pages that are geo-targeted or point geo-targeted links to the homepage?
Actually, I created a separate thread for this question - I figured I would keep it organized. Here...
| Cyclone0 -
301 Redirect for 3 Domains into 1 New Domain
If I understand your question, you cannot redirect abc.com URLs from an htaccess file on ghi.com. That directive has to be placed in abc.com's htaccess file. So abc.com's htaccess would show the specific redirects to ghi, e.g.: redirect 301 /wines.html http://www.ghi.com/wines def.com would also have an htaccess with redirects for each landing page: redirect 301 /trade.html http://www.ghi.com/trade ghi.com would not need any redirects, since that's the site you want people to land on. Unless you want all URLs on abc.com to simply redirect to the root www.ghi.com, you have to write a redirect for every page you want to redirect (or use advanced code to rewrite every landing page to the new domain).
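The "advanced code" option can be sketched with mod_rewrite in abc.com's .htaccess (reusing the hypothetical abc/ghi domains from the example above); it maps every URL to the same path on the new domain instead of requiring one rule per page:

```apache
RewriteEngine On
# Send every request on abc.com to the same path on www.ghi.com
# as a permanent (301) redirect, preserving the URL path
RewriteRule ^(.*)$ http://www.ghi.com/$1 [R=301,L]
```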
| JaredMumford0 -
Mod rewrite question
I've found a solution and am posting it here in case anybody else is having the same problem: RewriteRule ^([0-9]{4})/([0-9]{2})/([^/]+)/[0-9]+ /blog/$1/$2/$3/ [L,R=301]
| ahirai0 -
Site-Wide Header image w/ Link hurting me?
Thanks for the help, but it seems it was just a Google ranking spike for some new content, and now things appear to be returning to normal.
| choiceenergy0 -
Rel = prev next AND canonical?
If these problems occurred due to product category parameters, here's a solution. Cross-check results with: site:mydomain.com inurl:product If you wanted to block a category parameter from being crawled, it might also be useful to have those pages organized into separate directories. And if you wanted to send a spider like Screaming Frog or GSite Crawler through a section of the site to create segmented sitemaps - so you can see which major parts of the site are having indexation problems - it would be much easier to do so if there was a directory you could ask it to crawl down from. Rel next/prev or View All canonical - fine for pagination when used correctly, but won't solve most faceted navigation issues. Rel canonical - helps with the same facets in a different order (e.g. blue/small/cheap vs. small/cheap/blue), but not designed for different facets (e.g. blue/small/cheap vs. blue/medium/cheap).
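A minimal sketch of pagination markup combining rel prev/next with a self-referencing canonical (the category URL is hypothetical), as it would appear in the head of page 2 of a paginated category:

```html
<!-- Page 2 of a paginated category: each page keeps its own
     self-referencing canonical, while prev/next describe the series -->
<link rel="canonical" href="https://www.example.com/category?page=2">
<link rel="prev" href="https://www.example.com/category?page=1">
<link rel="next" href="https://www.example.com/category?page=3">
```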
| EastEssence220 -
Google Plus Places Error
Wait a week or so; if it's a valid address, Google will fix it - it just takes some time.
| irvingw0 -
Disallow: /search/ in robots but soft 404s are still showing in GWT and Google search?
You could also look at using a meta robots noindex tag on /search/ pages, rather than just blocking them in robots.txt, as this will remove the existing URLs from the index. Keep in mind that Google has to be able to crawl a page to see the noindex tag, so you'd need to remove or relax the robots.txt Disallow for this to take effect.
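As a sketch, the tag would go in the head of each /search/ result page:

```html
<!-- Tells crawlers not to index this page; links on it may still be followed -->
<meta name="robots" content="noindex, follow">
```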
| timhatton0