Category: Technical SEO Issues
Discuss site health, structure, and other technical SEO issues.
-
New domain
Yes, the keyword is in the domain. I haven't thought about canonicals. How can I do that? Duplicate the site and add a canonical tag to the first one? Can I point a canonical from one site to another on a different domain?
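For reference, cross-domain canonicals are supported by Google: the duplicate page points at the preferred original with a link element in its head. A minimal sketch, assuming example-b.com carries the duplicate and example-a.com is the preferred original (both domains are hypothetical placeholders):

```html
<!-- On the duplicate page at http://example-b.com/page.html -->
<link rel="canonical" href="http://example-a.com/page.html" />
```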
| mgfarte0 -
Adding Wordpress On Separate Server w/ subdomain = SEO Issues?
I would be jumping to a new host right away!
| EGOL0 -
Changing preferred domain
The approach you propose is good if you can ask most of the external linkers to update their links. If not, some link weight will be lost after the redirect. I can propose an alternative: create a dedicated subdomain or folder for the country with these issues, and redirect visitors by IP. Of course, in this case you should use a canonical tag to avoid duplicate content. The result is that the primary page URL appears in the SERPs, but all visitors from the specific country are redirected to the working pages. Cheers, Vladimir
| de4e0 -
Robots.txt for subdomain
A robots.txt file applies only to the subdomain on which it is placed. You need to create a separate robots.txt for each subdomain, which Drupal allows. It must be located in the root directory of your subdomain (e.g. /public_html/subdomain/) and will be accessible at http://subdomain.root.nl/robots.txt. Add the following lines to the robots.txt file: User-agent: * Disallow: / As an alternative, you can use a robots <META> tag on each page, or redirect the subdomain to a directory such as root.nl/subdomain and disallow that directory in the main robots.txt. Personally, I don't recommend the latter.
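The robots meta tag alternative mentioned above would look like this, placed in the head of every page on the subdomain you want excluded (a minimal sketch):

```html
<!-- Blocks indexing and link-following for this page only -->
<meta name="robots" content="noindex, nofollow" />
```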
| de4e0 -
URL specific websites with i-framed application
With the links from the iframe application to your primary site, I do not see a problem (if I am understanding you correctly). Because the content in the iframe originates from a single source, there would only be one link to the site, not 300 separate links. Content in an iframe can be indexed by Google, but it is indexed via the links to the source and then to the content, such that any juice goes to the originating source and not to the site where the iframe resides. So the link in the paragraph you install should be seen as a single link to the site, not 300. Hope this helps.
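As an illustration of the setup described above, the ~300 partner pages all embed the same framed document, so the link back to the primary site lives in exactly one place (all URLs here are hypothetical placeholders):

```html
<!-- Embedded on each of the ~300 partner pages -->
<iframe src="http://app.example.com/widget.html"></iframe>

<!-- Inside widget.html itself: the single link Google attributes to the source -->
<p>Powered by <a href="http://www.example.com/">Example</a></p>
```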
| RobertFisher0 -
Www vs. non-www
Thanks for all your responses - I will use this as the basis of my answer to the technical team.
| theLotter0 -
UK website ranking higher in Google.com than Google.co.uk
Hi there, Are you searching in Google.com from the UK, or are you actively making sure that you are seeing US-centric Google.com results? I ask because if you search in Google.com from the UK (without modifying your search to specify US results), you will receive UK-targeted results, even though you are not using Google.co.uk. I also happen to be in the UK at the moment. In order to see how Google.com ranks sites for a query in the US, you need to specify that you want US results. I use a Firefox / Chrome plugin to easily modify the URL for me: http://www.redflymarketing.com/internet-marketing-tools/google-global/ Thus, if I search for [credit cards] in Google.com, I can get this URL for US results: https://www.google.com/search?q=credit+cards&pws=0&gl=US. If I just go to Google.com from here in the UK, these are the results I get: https://www.google.com/search?hl=en&source=hp&biw=1177&bih=539&q=credit+cards&gbv=2&oq=credit+cards&aq=f&aqi=g10&aql=&gs_sm=e&gs_upl=770l1989l0l2037l12l8l0l1l1l1l271l903l3.3.1l7l0 This search results page contains a few UK brands, because even though I'm on .com, Google knows I'm in the UK. It's not totally UK-centric, of course, since there are US brands in there too. Depending on your query, you can get different degrees of relevancy. Are you seeing true US results, or are you seeing UK-centric results on Google.com? If "&gl=US" is not in the query string of the Google search URL, as it is in my first example here, then you are still seeing UK-centric results. It is not clear why Google shows different results in the UK for google.co.uk and google.com queries, but word is that soon, google.com results in the UK will be the same as google.co.uk. If the site really is ranking a lot better in the US than in the UK, there is a different problem, but I thought this might be more likely. Also, have you had a look at where most of your traffic comes from in Analytics? Cheers, Jane
| JaneCopland0 -
Duplicate content issues with australian and us version of website
Hi Drew, In general, we'd say to use one English language domain to target all English speaking regions, but since you have a .com.au domain, you won't be able to use this to effectively rank in the US. It is possible to mirror a domain and geo-target each so that they rank well in their intended locations. You would set the geo-target for the US site to the US in Webmaster Tools (the Australian site is automatically geotargeted because of its ccTLD). Google reps have stated in the past that this is effective and will eliminate the duplicate content problem. Of course, Google isn't perfect and sometimes this goes wrong. We had a client with two identical websites--one for the UK and one for the US--whose UK site always showed up in the US and whose US site ranked nowhere in either territory. That's very uncommon, however. If you do this--mirror the site and properly geotarget in Webmaster Tools under Site Configuration -> Settings--neither version of the site should find itself removed from Google due to duplicate content. Keep a very close eye on each site's indexation, ranking and traffic though. Like I say, this method isn't perfect as Google has gotten it wrong in the past. Cheers, Jane
| JaneCopland0 -
301 Redirect
Hi, thanks for the quick reply. The new pages will have unique content written for SEO purposes; that's one of the reasons I am doing this. However, with the example.com/products page having a high PR, I thought it would be best to take advantage of this with a 301. After thinking about it a bit more, though, I think I might actually keep this page and add the other product pages to help the keywords' SERPs.
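For reference, a single-page 301 on an Apache server is one line in .htaccess (a minimal sketch; the paths are hypothetical stand-ins for the real product URLs):

```apache
# Permanently redirect the old products page to its replacement,
# passing most of its link equity along
Redirect 301 /products http://example.com/new-products
```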
| Paul780 -
Robots.txt question
It's a good idea to have an XML sitemap and make sure the search engines know where it is. It's part of the protocol that they will look in the robots.txt file for the location of your sitemap.
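The sitemap location is declared with a Sitemap line in robots.txt; a minimal sketch, assuming the sitemap sits at the site root (the domain is a placeholder):

```
User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap.xml
```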
| KeriMorgret0 -
Will errors on a subdomain effect the overall health of the root domain?
Hi Greg - the short answer is, it depends... Which means, of course, there's a long answer. In Google's eyes, subdomains are often, though not always, treated as separate entities from the main domain, meaning that if content or issues on them are giving Google trouble, that won't necessarily extend to the other parts of the domain. However, this isn't always the case. We've certainly seen times when spam or manipulative behavior on a subdomain meant the root domain and content on it suffered as well. In the case you're describing, if the crawl issues are things like missing meta descriptions or long titles or other less-than-critical errors, I wouldn't be too worried. But if there are lots of 404s or duplicate content or infinite redirect loops, those could cause issues. If you're really not worried at all about what Google thinks of the subdomains or the content on them, you could always use robots.txt to block Google from accessing them, which should cause Google to not weight that content when judging the site as a whole.
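Because a robots.txt file only applies to the host it is served from, blocking a whole subdomain means placing a file like this at the subdomain's own root, e.g. http://sub.example.com/robots.txt (a hypothetical address):

```
User-agent: *
Disallow: /
```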
| randfish0 -
What keywords should i be using to promote my site
Along the lines of what Alan is saying, I think your site is a prime candidate to take advantage of many juicy long-tail keywords. I understand that you'd like to drive traffic to your home page, but keep in mind that the more likely scenario is you'll draw traffic through your articles, which are easier to optimize than a frequently changing home page. This is a good thing, though, because people who enjoy your articles will click the home banner and read other articles from your home page. I've become a fan of many blogs this way. So how do you get more traffic to your site? Make sure you incorporate keywords into every article you write, especially in the titles. For instance, with the Michael Le Vell article, you can search for relevant keywords by simply using Google Autocomplete (see attached image). This will show you what people are frequently searching for related to Michael Le Vell. In this case, it's good that you included the word "arrested" in your Michael Le Vell article title, but it would be better for SEO purposes if the phrase was together. An alternate title might be "Michael Le Vell Arrested: On Coronation Street AND In Real Life!" There are a lot of related Coronation Street keywords you could use that can be found through the Google Keyword Tool. My advice: leverage the strength of your site to draw traffic to the entertaining articles and your readers will naturally go to your home page when they want to read more. I hope that helps! [Attached image: michael-le-vell.jpg]
| MarcMenninger0 -
Rel=Canonical being ignored?
Hi Josh, If the canonical tags are set correctly, the SEOmoz PRO platform shouldn't throw a duplicate content error for these 2 URLs. 1. Make sure there's not a 3rd (or 4th, 5th, etc) URL reported by SEOmoz that's causing the duplicate content flag. 2. If you suspect an error, feel free to contact the help team at help@seomoz.org. Give them your membership and campaign information, along with the URLs that are causing you problems, and they should be able to help you troubleshoot.
| Cyrus-Shepard0 -
Dynamic Parameters in URL
You should decide which pages you want indexed. You should ask someone who uses Magento for additional practical tips; I haven't used it myself. As I see it, the parameters appear when you filter results by price or some other technical attribute. The problem is not just the parameter in the URL, but also that the title of the page remains the same even though both the content and the URL have changed. This way you are generating duplicate titles as well. One solution might be to place canonical link elements on filtered result pages, since those are not specific to your search term. The link should tell Google that the canonical page for filtered results is the category page itself. So for https://www.theprinterdepo.com/printer-bundles/refurbished-laser-printer-bundles.html?price=3%2C100 the canonical page would be https://www.theprinterdepo.com/printer-bundles/refurbished-laser-printer-bundles.html. Be careful about simply removing attributes, because if those pages are already displayed in SERPs you might lose traffic. Canonical would be the best solution, I think.
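Concretely, the filtered page would carry a canonical link element in its head pointing at the category page; a minimal sketch using the URLs above:

```html
<!-- On .../refurbished-laser-printer-bundles.html?price=3%2C100 -->
<link rel="canonical" href="https://www.theprinterdepo.com/printer-bundles/refurbished-laser-printer-bundles.html" />
```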
| sesertin0 -
Coral Cdn
Once I had a project where my rankings improved a lot after raising the YSlow score from 80 to 96. The CMS was Joomla, and the design I chose was a totally white one that loaded fast by itself, but even so, the extra craft seemed to be a benefit. Since then I have tended to be concerned about this issue, although I haven't done any testing with it since that project. Maybe I'll do a planned improvement again to see how much these factors matter. Thanks for the advice.
| sesertin0