Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • Oh, if they're just pages that are part of a template you're using then you can safely get rid of them without having to redirect. Just make sure you remove all internal links to those deleted pages so you don't end up with 404s.
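
    To help with that last step, here is a minimal sketch (the function name and the URLs are hypothetical, not from the thread) of scanning a page's HTML for internal links that still point at the soon-to-be-deleted URLs, using only Python's standard library:

```python
from html.parser import HTMLParser

class LinkFinder(HTMLParser):
    """Collect anchor hrefs that match a set of deleted paths."""
    def __init__(self, deleted_paths):
        super().__init__()
        self.deleted_paths = set(deleted_paths)
        self.hits = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href", "")
            if href in self.deleted_paths:
                self.hits.append(href)

def find_stale_links(html, deleted_paths):
    """Return every link in `html` that targets a deleted URL."""
    finder = LinkFinder(deleted_paths)
    finder.feed(html)
    return finder.hits
```

    Run it over each template or page before deleting, and fix whatever it reports.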

    | ChrisAshton
    0

  • Along with applying the 301s and using the Change of Address form, I'd also submit a sitemap of the OLD URLs; this encourages Google to recrawl those URLs and register the 301s. After a few weeks, submit a new sitemap with the NEW URLs and all should be well.
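
    For illustration, a sitemap of the old URLs is just an ordinary sitemap file that lists them (the domain and paths below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/old-page-1</loc></url>
  <url><loc>http://www.example.com/old-page-2</loc></url>
</urlset>
```

    Once Google has processed the 301s, swap this file for one listing only the new URLs.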

    | TammyWood
    0

  • Hi Satish, Then my first response is appropriate. Google will use whatever title/description it feels is best; again, I recommend you watch the YouTube video from Matt Cutts (former Google SEO guy) about why that is. In your particular case, it could be that "Doctor Deepu Chandru" is driving more traffic to the site than your focused keywords, so as long as people keep engaging with Google's suggestion, Google will keep using it. Or perhaps you have made changes and not allowed Google enough time to re-index the site. My suggestion: remove the H1 tags from the doctors' names (you are using multiple H1 tags there, which could be a bit confusing for Google) and change them to H2 tags. Then, under your logo, write the page title in an H1: "Liposuction Surgery and Laser Hair Removal". This tells Google the most important thing on this page is those specific keywords. Give it 1-3 weeks for the site to get re-crawled and for Google to update its index, and you should see the difference. Hope this helps, Don
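
    A minimal sketch of the heading structure Don describes (the heading text comes from his suggestion; the surrounding markup is illustrative):

```html
<!-- One H1 for the page's focus keywords, placed under the logo -->
<h1>Liposuction Surgery and Laser Hair Removal</h1>

<!-- Doctors' names demoted from H1 to H2 -->
<h2>Doctor Deepu Chandru</h2>
```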

    | donford
    0

  • Use your website's underlying programming language to redirect these URLs to the correct ones; I did the same trick on my own site. E.g. ifsc-codes.in/HDFC-BANK-LTD-CHAITANYAA-NAGAR-RAIGARH-M-P-IFSC-CODE.html redirects to http://ifsc-codes.in/hdfc-bank/madhya-pradesh/rajgarh/chaitanyaa-nagar-raigarh-m-p-branch-ifsc-code. I hope this will work in your case as well.
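
    The poster doesn't show the actual code, so here is a language-agnostic sketch of the idea in Python (the mapping reuses the example URLs from the answer; the helper name is made up): keep a lookup table of legacy paths and answer each with a permanent (301) redirect.

```python
# Map each legacy path to its canonical replacement.
REDIRECTS = {
    "/HDFC-BANK-LTD-CHAITANYAA-NAGAR-RAIGARH-M-P-IFSC-CODE.html":
        "/hdfc-bank/madhya-pradesh/rajgarh/chaitanyaa-nagar-raigarh-m-p-branch-ifsc-code",
}

def redirect_target(path):
    """Return (HTTP status, Location header) for a requested path."""
    target = REDIRECTS.get(path)
    return (301, target) if target else (404, None)
```

    Whatever framework or language the site runs on, the pattern is the same: look up the old path and send a 301 with the new location.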

    | abhigarapati
    0

  • Hi Charley! We ran a few tests on the page in question and it appears that there is some JavaScript on this page that is causing trouble for rogerbot, our crawler. The '#.' portion of the URL is injecting a script. If you remove everything after the '#.' in your URL, you should be able to crawl the page successfully! It's hard to confirm this without verifying which keyword you're optimizing the page for, but if you continue to run into problems or have additional questions, please write in to us at help@moz.com!

    | moz_support
    0

  • Due to a (now-fixed) bug in Q&A, this thread is a duplicate of another thread. This thread is locked to future responses, but please feel free to continue the discussion there.

    | MattRoney
    0

  • Thanks Peter! The reposts were due to a bug we ran into yesterday. I'll lock this thread to further responses.

    | MattRoney
    0

  • It is already indexed and I can find it on the second page of the SERPs for my keyword.  I had someone else, unrelated to my site, check and they see it there too.  I've never needed to do this before, and am surprised by what happened.  Thanks again.

    | EGOL
    2

  • Yes - iso-8859-15 is a very outdated encoding; the validator suggests that you should use UTF-8. I believe this is a SEMrush issue too. Just declare the pages as UTF-8 and the bug will be fixed.
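
    A minimal sketch of the fix, assuming the encoding is declared in the page's head (HTML5 syntax):

```html
<!-- Replace the old declaration, e.g.
     <meta http-equiv="Content-Type" content="text/html; charset=iso-8859-15">
     with: -->
<meta charset="utf-8">
```

    Note that the files themselves must also be saved as UTF-8, not merely declared as such.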

    | Mobilio
    0

  • Thanks. The option under settings actually hasn't been visible for me for months, but I just checked again and it allowed me to make the change.

    | KatherineWatierOng
    0

  • Are you using a CMS, or some in-house solution? If it is a CMS, in many cases you should be able to update the CMS so that the two links are generated but the page itself isn't generated twice. Another option, if two pages must exist, would be to set a canonical on both pages pointing to the one main location for the content, while using pushState on the URL to steer the browser into the main path. Although, the more I think about that one, it may not be a 100% viable option.
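
    For the canonical option, both variants of the page would carry the same tag pointing at the one main URL (the domain and path here are placeholders):

```html
<link rel="canonical" href="https://www.example.com/main-location/">
```

    With that in place, search engines should consolidate signals onto the main URL even if both pages remain reachable.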

    | RosemarieReed
    0

  • Well - this has been asked many times, and the answer is no. Here is more info about it: https://www.youtube.com/watch?v=hXt23AXlJJU https://www.youtube.com/watch?v=keIzr3eWK8I https://www.seroundtable.com/seo-geo-location-server-google-17468.html https://moz.com/community/q/does-the-location-of-my-server-effect-my-seo But if you are using a generic domain (not a ccTLD, and without Search Console country targeting), location could be a problem - see https://builtvisible.com/ip-location-search-results/ where they have a .com domain and target the UK market. So if you have a .de domain, you set Germany as the preferred geo location in Search Console, and your server is located in France (near Germany), then you shouldn't have to worry about it.

    | Mobilio
    2

  • Thanks for your detailed and helpful response, Peter! I went through all of your links and found several matches between the examples provided there and our situation, especially concerning the domain's recency. We'll be patient for a couple of weeks to see if anything moves in Google's SERPs; otherwise we'll start trying to "force" search engines to see and trust our site, and maybe also ask Google directly for a re-evaluation. Many thanks; I'll update this post if anything relevant comes up, in order to help others who run into a similar issue in the future.

    | ruggero
    0

  • A CDN isn't pure fun for SEO, because you give your links away to other companies or domains. The closest examples are http://cdn.example.com/ or http://examplecom.amazoncdn.com/ - in both cases your links to images point elsewhere, which isn't too good. The only way around it is if your site and images are on the same domain/subdomain, but this requires very good deployment planning, and as a result you will be "vendor locked" to the cloud solution.

    | Mobilio
    0

  • It's also worth noting that our last index update was on 11/12, so Domain Authority hasn't changed since then.

    | MattRoney
    0

  • I figured .htaccess would be the best route. Thank you for researching and confirming. I appreciate it.
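
    For anyone else landing here, a typical .htaccess 301 looks like this (the paths and domain are placeholders; the exact rule depends on the URLs in the original question):

```apache
# Single-page permanent redirect
Redirect 301 /old-page/ https://www.example.com/new-page/

# Or pattern-based, via mod_rewrite
RewriteEngine On
RewriteRule ^old-section/(.*)$ /new-section/$1 [R=301,L]
```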

    | kirmeliux
    1

  • First, I have been on Mac since 2008/09, so I missed Windows 7, 8, and 10, but my experience goes back to Windows 3.0, somewhere in '91/92. Long story short: you should do routine website testing in Edge just to see that everything is correct and works properly. You shouldn't see any impact if you don't follow their suggestions.
Actually, all web developers and designers have a long and difficult partnership with IE. IE6 broke W3C standards and implemented its own version of them; this is described here: https://en.wikipedia.org/wiki/Internet_Explorer_box_model_bug IE7 fixed some of that but added new "bugs", and IE8 added a new portion of them, as described here: http://www.smashingmagazine.com/2009/10/css-differences-in-internet-explorer-6-7-and-8/ http://code.tutsplus.com/tutorials/9-most-common-ie-bugs-and-how-to-fix-them--net-7764 https://css-tricks.com/ie-css-bugs-thatll-get-you-every-time/ and in many other articles. So in the end all devs were a little bit fed up and stopped supporting IE at all; they just wrote some IE-specific hacks for specific versions so users would see something almost correct.
Later, WebKit and Firefox joined the party and added their own versions of CSS styles, each starting with the -webkit- or -moz- prefix; Opera also joined with the -o- prefix. But within a few years, with CSS3, almost all vendor prefixes were resolved and everyone finally agreed to abandon them. All those vendors now ship auto-updating browsers, so new versions get pushed to users; today you can't imagine working with anything other than the current version of Firefox, Chrome, or Opera. Safari is different, because the current version is only on the current version of OS X. In this world only IE is outdated, simply because MS refuses to push new versions to old OSes.
So today, as Bob Dylan sang, "the times they are a-changin'", and everything is different. Microsoft is trying to bring back support for all the standards it neglected in the past, but the web is different now, and bringing IE back into line is hard. That's why they sent mails about "possible problems". Honestly, my site works in IE 6, 7, 8, 9, 10, and 11 (tested, no joke!) and should work in Edge too (not tested). In all web companies there are jokes about IE6 or IE7 support: http://www.smashingmagazine.com/2011/11/but-the-client-wants-ie-6-support/ http://www.sitepoint.com/how-to-stop-wasting-time-developing-for-internet-explorer/ And if today Microsoft warns us about "standards", we should ask them where they were all those years NOT supporting standards.
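
    For readers who never met vendor prefixes, they looked like this (an illustrative CSS snippet, not from the original post):

```css
/* Same property, once per vendor, plus the standard form last */
.box {
  -webkit-border-radius: 4px; /* old WebKit (Safari/Chrome) */
  -moz-border-radius: 4px;    /* old Firefox */
  border-radius: 4px;         /* the CSS3 standard everyone settled on */
}
```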

    | Mobilio
    0

  • Thanks for your quick response and your recommendations!

    | seogirl22
    1

  • 1. It could certainly impact the number of pages indexed. Have you explored to see what those pages might be? 2. It could also impact your search rankings in a number of ways. Care to share the domain?

    | rjonesx.
    0

  • Thank you, Oleg, for your response, and thanks, Matt, for jumping in. The response was definitely very informative, but I'm still on the fence about this issue because, while Oleg confirmed what I had thought, that #2 would be the better choice, I now want to ensure that all the products on pages past page 1 will be crawlable by search engines. Are there any scenarios you can think of in which search engines would not be able to read subsequent pages? Assuming that this is done using Ajax, should we be okay? (Please bear with me; my specialty is link building and content, not the technical stuff.)
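
    One common safeguard, sketched here as an assumption rather than a known detail of the setup under discussion: even if Ajax loads the products, keep plain, crawlable links to each page in the HTML, so engines that don't execute the script can still reach them (the paths below are placeholders):

```html
<!-- Real hrefs that work without JavaScript; Ajax can enhance them -->
<nav class="pagination">
  <a href="/category?page=1">1</a>
  <a href="/category?page=2">2</a>
  <a href="/category?page=3">3</a>
</nav>
```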

    | whiteonlySEO
    0