Category: Technical SEO Issues
Discuss site health, structure, and other technical SEO issues.
-
What should I do with a large number of 'pages not found'?
I would leave the pages up but mark them as "noindex". When I worked in eCommerce, this was a great tactic. For UX purposes, you could try to steer people to similar products, but keep the originating page as "noindex" or "nofollow".
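For reference, a minimal sketch of what that directive looks like in the page's markup (the scenario is hypothetical):

```html
<!-- In the <head> of the removed/out-of-stock page:
     "noindex" drops the page from Google's index, while "follow"
     still lets crawlers pass equity through the page's links -->
<meta name="robots" content="noindex, follow">
```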
| HillyCE0 -
Site-wide Links
Hi, If your users are clicking on the links then yes, keep them, but make them nofollow. However, if they are not driving traffic to your other sites and don't add any real value, I would recommend removing them. This is very similar to a question that was asked last year, so instead of duplicating everything that was said there, I'll just post the link so you can read the comments: http://moz.com/community/q/are-sitewide-links-bad-for-seo (we know Google doesn't like duplicate content)
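If you do keep them, nofollowing a sitewide link is just a rel attribute on the anchor; a sketch (the URL is a placeholder):

```html
<!-- Sitewide footer link kept for users, with rel="nofollow"
     so it passes no PageRank to the other site -->
<a href="https://www.example-sister-site.com/" rel="nofollow">Visit our sister site</a>
```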
| Andy-Halliday0 -
Blog Location
Hi Sara, If you are keen to get more traffic through the news stories and the main blog link, it's definitely a mistake to move it, just from a usability perspective. I agree with Andy that the stats you're seeing for these links in Analytics suggest there is room to improve the page and increase the click-through rate. The effect of a link being above or below the fold is probably negligible when it comes to PageRank passed (with the exception of footer links, which appear to be devalued).
| JaneCopland0 -
Cloud Hosting and Duplicate content
You can set up a CNAME record to be used with CloudFront on copyfaxes.com. It could be something like images.copyfaxes.com if you are only using CloudFront to serve images. I serve all my media content and scripts via WPEngine's CDN and use a CNAME so that all the content appears to reside on my subdomain, e.g. cdn.bestpremiumthemes.net/wp-content/uploads/2014/04/Input-Form-Designs-100x65.png
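The DNS side is a single CNAME record; a hypothetical zone-file sketch (the CloudFront hostname is a placeholder — use the d*.cloudfront.net name from your own distribution):

```
; Point a branded subdomain at the CloudFront distribution
images.copyfaxes.com.    3600    IN    CNAME    d1234abcdexample.cloudfront.net.
```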
| Saijo.George0 -
Duplicate content on user queries
Is it possible to leave the products up and just disable the buying function for them? That is the best bet SEO-wise; that way your pages keep their authority.
| LesleyPaone0 -
Sitemap & noindex inconsistency?
The sitemap is an indication to Google to crawl those pages. There are instances where people use a "noindex, follow" meta tag and still list those pages in their sitemaps, so that Google will crawl all the links on the page but not index the page itself. The meta tags or HTTP headers on your page signal to Googlebot how to handle that page, regardless of your sitemap and what's on it.
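If it's easier to send the signal from the server than from the page markup, the same "noindex, follow" directive can go out as an HTTP header; a hypothetical Apache sketch (assumes mod_headers is enabled, and the filename is a placeholder):

```apache
# Send the noindex signal as an X-Robots-Tag header for a page
# that is still listed in the sitemap so its links get crawled
<Files "tag-archive.html">
    Header set X-Robots-Tag "noindex, follow"
</Files>
```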
| gazzerman10 -
GWT shows 38 external links from 8 domains to this PDF - But it shows no links and no authority in OSE
Hi Dana, I'm inclined to think this is a problem with how the tools process links to and from PDFs, rather than with PDFs passing PageRank themselves, although I am not sure. I would certainly not say no to a link to or from a great PDF if that were the only option, but I have also actively encouraged clients to re-publish their quality content housed in PDF format as HTML and redirect the PDF file to the new source. We're just a lot more sure about how authority passes through HTML files.
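The redirect itself is a one-liner; a hypothetical .htaccess sketch (assumes Apache with mod_rewrite; both paths are placeholders):

```apache
RewriteEngine On
# 301 the old PDF to the re-published HTML version so its authority follows
RewriteRule ^downloads/whitepaper\.pdf$ /guides/whitepaper/ [R=301,L]
```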
| JaneCopland0 -
Moving to New Domain - Ranking impact
Hi Conrad, Unfortunately every migration is going to yield different results in terms of how well the redirection goes and how long you have to wait to get your rankings back if they suffer. Thomas' experience is fairly typical (and the resources he cites here are good too). It's impossible to say what will happen - a particularly large site (let's say a big e-commerce site for a high-street retailer) might suffer due to the sheer volume of URLs that need to be moved and picked up by Google; a small website may have an easier time. However, metrics like the age and authority of the moving website may well play into how successful the move is as well. As such, it's really hard to say exactly how a migration will go without seeing the sites (and even then, ranking problems can crop up that were unexpected). Cheers, Jane
| JaneCopland0 -
How to fix google index filled with redundant parameters
Thanks again Alan. I've checked the site with Screaming Frog and it doesn't return any URLs with parameters, so at this stage I might be OK. I am getting a "severe health issues" message in Webmaster Tools, but it doesn't appear to be affecting the URLs I want to keep. I'll likely remove the entry once things have cleared up some more. Thanks Jeff. At the moment I'm stuck with the Zeus web server (insert expletives here), so there's no .htaccess file, or I'd be in a better position. After messing around with it and very limited documentation, I can only get the site operating with index.php in the URL, with SEF URLs for the remainder of it. I'm investigating migration to an Apache server, so that might make things easier. Regards, Ian
| iragless0 -
How to fix Google index after fixing site infected with malware.
Thanks Tom, that's a good point. Part of my problem lies in the number of URLs with parameters (thousands), so applying status codes of any type isn't really viable. I'm starting to see the URLs clean up after adding the entries to robots.txt. Regards, Ian
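For anyone in the same situation, the robots.txt entries look something like this (the parameter names here are placeholders — use whichever parameters your URLs actually carry):

```
# Keep crawlers out of the parameterised duplicate URLs
User-agent: *
Disallow: /*?sessionid=
Disallow: /*?sort=
Disallow: /*&sort=
```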
| iragless0 -
Moving Some Content From Page A to Page B
But as long as I have a lot of great pictures indexed by Google, wouldn't it be a waste to noindex such pages, even though I don't care about ranking for them? In other words, if I use "noindex, follow", wouldn't that mean Google won't count the pictures on those pages, which could ultimately hurt more important pages on my site? I want to move these pictures to another page where they don't load until users scroll. I will share what you mentioned above with my web developers. If you can work out what the JavaScript is, please do keep me posted.
| khi50 -
Manual Webspam Error. Same Penalty on all sites on Webmaster Tools account.
As per your example above, I have enabled auto-syndication to all my social networks, so only an excerpt is syndicated, along with a link back to the original content. If you open any of those 50 links, you'll see it's a Facebook post, and sharing content is not against the webmaster guidelines. I have already started editing and rewriting the posts; it will take more than a week.
| ndroidgalaxy0 -
How to fix keyword cannibalization?
Hi there, A relatively old post on this from the Moz blog might be of use: http://moz.com/blog/how-to-solve-keyword-cannibalization Have you done any link building to the primary vinyl banners page? When a range of different pages rank for one query, it can often indicate a problem with the strength of the primary page, with site hierarchy not favouring that page as the canonical source of generic information about the topic.
| JaneCopland0 -
So many internal links to the same page
Hi there, Nope - this shouldn't be an issue, as Marty says, as long as the instances of you linking to this page multiple times are natural and necessary to the content. Avoid using too much optimised anchor text with these links, but Marty is also correct that linking with anchors like this is very common and a good way to point to content within a longer page.
| JaneCopland0 -
Hello i want review of my site.PLease have a look
I am no authority, but the site does seem a bit all over the place. I like the social media links on top and the call-to-action-style newsletter sign-up, but between the domain name and the website's first impression, I would be lost myself. The colors are easy on the eyes, and the site is easy to navigate. Just my thoughts.
| Berner0 -
Remove Directory In Webmaster Tools
This might do what you want: http://apps.shopify.com/power-tools-bulk-edit-tags
| BlueprintMarketing0 -
When to use mod rewrite / canonical / 301 redirect
Hi, I would definitely set up a 301 redirect for the case-sensitive URLs issue. You may already know how to set this up, but I thought I would include this link: http://webmasters.stackexchange.com/questions/18670/how-to-redirect-any-url-to-lowercase-url For your pagination issue, look at whether it is possible to change /casestudy into a "view all" page, then place a canonical link to the /casestudy page in the <head> of each page in the paginated set. Another option for your pagination is using rel="next" and rel="prev", and for the filter parameters you could tell Googlebot not to crawl those parameter URLs. Hope this helps
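For the case-sensitivity fix from that link, the usual Apache approach is a sketch like this (note it has to live in the main server config or a virtual host, since RewriteMap is not allowed in .htaccess):

```apache
RewriteEngine On
# Define a map that lowercases its input using Apache's internal tolower
RewriteMap lowercase int:tolower
# Only fire when the requested URI contains an uppercase letter
RewriteCond %{REQUEST_URI} [A-Z]
# 301 to the lowercased equivalent
RewriteRule (.*) ${lowercase:$1} [R=301,L]
```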
| Matt-Williamson0 -
Need Help With WWW vs. Non-WWW Duplicate Pages
Hello! Is it just that you have both www and non-www versions of all pages resolving? If so, you can add one 301 redirect rule in IIS to redirect everything from one version to the other and solve the problem. If not, feel free to provide more detail and I or someone else can chime in. EDIT: I just took a quick look, and it looks like that's part of the problem; follow the above and it should take care of it. I also noted that the non-SSL version is 302 redirecting to the SSL version, which is an incorrect implementation. You want that to be a 301, so that if someone links to the non-SSL version you get credit for that link juice. Cheers!
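For reference, that rule can be written with the IIS URL Rewrite module in web.config; a hypothetical sketch with example.com as a placeholder (this fragment sits inside <system.webServer>):

```xml
<rewrite>
  <rules>
    <rule name="Redirect www to non-www" stopProcessing="true">
      <match url="(.*)" />
      <conditions>
        <add input="{HTTP_HOST}" pattern="^www\.example\.com$" />
      </conditions>
      <action type="Redirect" url="https://example.com/{R:1}" redirectType="Permanent" />
    </rule>
  </rules>
</rewrite>
```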
| mosquitohawk0 -
Screaming Frog Content Showing charset=UTF-8
Greetings Pamela! This is nothing to worry about at all. UTF-8 is simply a type of character encoding, declared in a site's pages to tell web browsers how to interpret the text. See: http://screencast.com/t/s4I2RNsgqUh As it's perfectly normal and not a negative signal, there's no need to change it at all.
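What Screaming Frog is picking up is just the declaration in the page source, e.g.:

```html
<!-- HTML5 form of the encoding declaration -->
<meta charset="UTF-8">
<!-- Older HTML4-style equivalent you may also see -->
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
```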
| mosquitohawk0 -
Proper Schema usage for service based businesses?
Hi Imedia, I agree with Sam: don't use this for your location-less service areas. Use it to highlight your physical location.
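To mark up the physical location, a minimal JSON-LD sketch (the business details here are entirely made up; pick the most specific schema.org type that fits your business, e.g. Plumber, Electrician, or plain LocalBusiness):

```json
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Plumbing Co.",
  "telephone": "+1-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example St",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701"
  }
}
```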
| MiriamEllis0