Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • Egol, thank you for your advice and for your commendation. I really appreciated that. We worked hard on it, and it is a fun niche to be in, with lots of active and involved cyclists who appreciate this kind of piece. I will edit the properties of the document; I hadn't thought about that. Thanks very much!

    | inboundauthority
    0

  • Hi, you need to fix this by using rel="canonical" in the <head> section of your pages, so that where there are duplicates, they all point to one of them as the original. You can see more information here: https://support.google.com/webmasters/answer/139394?hl=en In Joomla and other content management systems, duplicate pages can sometimes be an issue because of the way search-engine-friendly URLs are created. With these systems you can generally find an extension/plugin to help resolve the issue. For Joomla there doesn't seem to be much on offer in their Extensions Directory, but you can see what's available here: http://extensions.joomla.org/extensions/site-management/seo-a-metadata/url-canonicalization- I hope that helps, Peter

    | crackingmedia
    0
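
    A minimal sketch of the canonical tag Peter describes, using https://www.example.com/page as a placeholder for the preferred URL (not the asker's actual address). It goes in the <head> of every duplicate version of the page:

    ```html
    <!-- placed in the <head> of each duplicate page, pointing at the one original -->
    <link rel="canonical" href="https://www.example.com/page" />
    ```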

  • I'm having a similar difficulty on my own site: the words "Philadelphia SEO Company" are being appended to the end of every post (not pages, posts only). I've ticked "Force rewrite titles" on the Yoast General tab. I'm using "%%title%%" in the title template for both pages and posts on the Post Types tab. I have not touched the header.php file. Any ideas?

    | DonnaDuncan
    0

  • Hello, what Peter says is partially true. The weird thing is, when I Google the keyword from the UK, they rank on page one for me as well. I did some quick keyword research for you, and here is why I think they rank: they are a partial EMD, i.e. they use part of the keyphrase in their domain. Before anyone jumps in and tells me this is not as powerful as it used to be, please run a test for property/location keyphrases or non-sales/money-buying keywords; you will be surprised. The top 10 competition is quite weak, and I do not think I would have any problems ranking this web site with a couple of links and good-quality articles. There are 41 web sites in Google US that contain the keyword in both the title and an external anchor text, as per Market Samurai. Again, weak competition. Hope this answers your question.

    | artdivision
    0

  • Thanks. I'll contact Moz. And yep, I'm using a tool called Spyder Spanker, but I'm not blocking rogerbot. I use this tool on all my websites with the same configuration, and all the other websites return a 200.

    | sebagorka
    0

  • Hey Luca,

    Weird, a similar issue just happened to a friend's site here locally. To clarify, HTTP pages are still indexed if you search site:https://www.register.it/ -inurl:https, but it's just that both HTTP AND HTTPS are indexed now. Also, I don't see we.register.it loading at all, only www.register.it. Was this recently changed? I am seeing both we and www indexed in Google. Did we always redirect to www? Since HTTPS is now redirecting to HTTP, Google will not be able to see the canonical.

    Here's what I think you should try:

    1. Undo the 301 redirect from HTTPS to HTTP.
    2. In your HTTPS robots.txt file, block crawling.
    3. Register HTTPS in Webmaster Tools (if not already) and do a URL removal on all the pages you don't want indexed from HTTPS (probably all of them).
    4. Sort out your internal linking using absolute URL paths, and ONLY link to pages with HTTPS when they really are HTTPS pages; otherwise link to HTTP.
    5. Sort out your we vs. www subdomain issue.

    This is honestly a slightly involved situation, where we can give cursory advice here, but I just want to be transparent that if you're not completely comfortable with what to do, you might want to find someone who can peek behind the curtain (unfortunately, as Moz Associates we're not able to log into your accounts or anything).

    EDIT: I just want to add that you should be sure you have updated and working XML sitemaps for the HTTP host and subdomain you want indexed, and that they are submitted to WMT.

    | evolvingSEO
    0
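
    For the robots.txt blocking step evolvingSEO suggests, a minimal sketch of the file served only on the HTTPS host (the HTTP version keeps its normal robots.txt, so HTTP pages stay crawlable):

    ```
    # robots.txt served on the HTTPS host only: block all crawling there
    User-agent: *
    Disallow: /
    ```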

  • If you have the same content in the press release as you put out there for the news sites, then it's a duplicate content issue. Rarely does Google penalize sites for duplicate content (especially when it's just one page); they usually just choose one version, list that, and drop the others from the listings so they don't have a hundred results with the same content. My guess is Google is choosing to list the news sites over your client's site. How authoritative is your client's site? Did you use a canonical tag?

    | Kurt_Steinbrueck
    0

  • Hi guys, thanks for your help. I decided that updating the robots.txt would be the best option. Ben

    | benjmoz
    0

  • Thanks for the advice Kurt and I'll make sure to be careful. It doesn't look like we will invest too much more in this strategy as it is quite resource intensive.

    | JustinButlion
    0

  • Hi, Assuming that no changes were made to the page content, I would see if the drop happened around the time of a Google algorithm change by using a free tool like Fruition: http://fruition.net/google-penalty-checker-tool/. It's also worth looking at the page's backlink profile (Open Site Explorer) as any spammy links acquired may have caused the problem. I would be surprised if simply adding nofollows to some outbound links would have such a drastic impact on your ranking. George @methodicalweb

    | webmethod
    0

  • Hi Tom, thanks for the response. We received the manual penalty at the same time Penguin 2.0 was released (sometime around May 22nd). We've got some really impressive content marketing pieces in place for the next few months, so hopefully the value we'll gain through that will replace the links that were removed. Thanks again.

    | Sandeep_Matharu
    0

  • Hi Peter, thank you so much! It makes sense to me. Best regards.

    | alejandrogm
    0

  • I'd 301 the removed page to the current page.

    | KeriMorgret
    0
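
    A sketch of the 301 Keri suggests, in Apache .htaccess terms. The paths and domain here are placeholders, not the asker's actual URLs:

    ```apache
    # send the removed page's URL permanently to the current page
    Redirect 301 /removed-page https://www.example.com/current-page
    ```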

  • Nick, I have had similar problems with "experts" in this field. Everything's expensive and most probably not going to work out the way they expected. So far I have had more help on Moz than any SEO company has provided. I just keep reading, and it's starting to get better. It's no fun being screwed over repeatedly, but it seems to be a rite of passage as a webmaster.

    | mark_baird
    1

  • Davinia is correct, in my opinion. There is no reason not to want the image indexed.

    | mark_baird
    0

  • Hi, it sounds as though you have done everything right. BUT (and this is a bugbear of mine): Google doesn't display breadcrumb rich snippet info on HTTPS pages. Also, some directories won't link to HTTPS pages (I've never seen this as a major issue, because it's usually the spammy directories that do this). Amelia

    | CommT
    0

  • Anyway, you should find a solution that redirects the entire domain while "keeping" the URL structure under the new subdirectory. Example: www.domain.co.uk/page1 redirects to www.domain.com/uk/page1. And for those pages that have no related page, or would 404, you also redirect them by using a rewrite rule that redirects EVERYTHING to the new location, even under a subdirectory. Rewrite rule example (for the .co.uk domain):

        RewriteCond %{HTTP_HOST} !=www.example.com [NC]
        RewriteRule ^(.*)$ http://www.example.com/uk/$1 [R=301,L]

    | FedeEinhorn
    0

  • It's a duplicate content problem. The site's content is duplicated 100% on many other sites, including http://no1loveastrologer.com and http://archive.is/0T8BA The main site doesn't have enough authority for Google to consider it the source. Google is indexing all the content in other places, so it doesn't "need" to index your client's page. Rewrite unique content on as many pages as you can and ping the site in WMT (Fetch as Google, Submit all Pages to Index), and you should see the site come back.

    | MattAntonino
    0