Category: Technical SEO Issues
Discuss site health, structure, and other technical SEO issues.
-
What are the things that need updating before applying SEO?
Hi Falguni, This is a very broad question. To get started, though, I would recommend 1) studying Moz's free beginner's guide to SEO; and 2) installing Yoast's SEO plugin (since you have a WordPress site). I hope that helps! Christy
| Christy-Correll0 -
Organizing A Backlink Authority Category Page
I typically advise making promotion URLs "evergreen" so you can re-use them. For example, if you know you're going to have a Christmas sale every year, don't make the URL /christmas-sale-2015/. Just /christmas-sale/ will do. Then, when the promotion is over, you can put up messaging saying when visitors should check back (if it's a regular promotion) and provide a link to your main promotions page. You can re-use this same page every year, and it will gain more and more links over time and become increasingly powerful.

If the promotion is a one-time deal and not something that you run each month, quarter, or year, then I would 301 redirect the out-of-date promotion pages to the main promotions landing page. You can update the content on that page regularly and optimize it for terms like "sales, discounts, deals, promotions, promos...".
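The 301 redirect for a retired one-time promotion can be sketched as a single server rule. Assuming an Apache server (the /promotions/ target path is hypothetical):

```apache
# .htaccess - permanently redirect a retired one-time promotion
# to the main promotions landing page (a 301 passes link equity)
Redirect 301 /christmas-sale-2015/ /promotions/
```

On nginx, the equivalent would be a `return 301` inside a matching `location` block.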
| Everett0 -
Help me regarding Content
Sir, I made the website www.astrologersktantrik.com targeting all my keywords, but my website's new pages are not being indexed, unlike my friend's website.
| ramansaab0 -
Fetch as Google - stylesheets and js files are temporarily unreachable
I will investigate with our developer what we can do to prevent this from happening.
| WebGain0 -
302 redirected links not found
I checked your site with Screaming Frog - the issue with the product compare is solved. There is still a problem with the wishlist.

Example: http://www.stopwobble.com/wobble-wedge-white-hard-300-wedges.html - if you look at the source, there are links to https://www.stopwobble.com/wishlist/index/add/product/81170/form_key/XdNIJWsLMTHVFpVh/ which still return a 302. You should also correct links of the type http://www.stopwobble.com/sales/order/history/ in your HTML.

Don't forget that the Moz crawl runs only once a week - so the errors will only disappear after the next crawl.

rgds, Dirk
| DirkC0 -
301 Redirect Expert
Hi there! We have a list of SEO companies we know and trust, but it doesn't really include individual experts. If you feel at all comfortable working on your site yourself, it's very possible someone here can give you some tips.
| MattRoney0 -
Shortening URLs
Unless you're already not ranking well, changing URLs is honestly a big risk. I don't think you're likely to see much gain, if any, by dropping one fairly sensible word/subfolder from your URL structure. The risk and work involved in setting it up with 301s and the like far outweigh the benefit, which is questionable at best to start with. I wouldn't do it unless I was burning it down and starting over.
| BradsDeals0 -
SEO Question - Are 503/504 errors an issue?
Thank you for the responses. I figured it couldn't be a good thing, but they made it sound like it wasn't impacting user experience because the switch to the available nameserver was immediate. It still seems to me that if Google Search Console is flagging 503/504 errors daily, as is Bing WMT, there may be an issue. If it were only Bing, Google, or Moz flagging them, I wouldn't be as concerned. Heck, I even ran a Screaming Frog scan last night and got a handful of 504 errors. I'm going to reach out to them today to see about a fix. Thanks for your replies.
| KyleEaves0 -
Duplicate content on charity website
Hi Fraser,

Thanks for the info! Without being too intimately involved, based upon this description it sounds like keeping two separate sites would have been perfectly fine. It seems like there'd be the least user confusion (and much easier maintenance for you) with two sites.

As mentioned, the duplicate content issue is more of a myth as far as causing penalties in this situation. That said, this doesn't mean you shouldn't still keep the sites as unique as possible. You'd want to customize the content as much as you could for each site. I'd imagine because they are so different in real life, this wouldn't be hard to do: unique text, photos, design, etc.

Also, from looking at the sitemap, I don't think every one of those pages would be a "landing page" for commercial search terms. That's really the only issue I'd worry about - if you did have duplicate pages that were competing with one another to rank for the same keyword - but it doesn't sound like that's the case here.
| evolvingSEO0 -
Google Ecommerce Alerts
Hi, If you have an alert set up for your site and Google is sending you messages about your new product, it simply means that Google has found the page, which is a good thing. What is your concern? Ken
| CandymanKen0 -
My video sitemap is not being indexed by Google
Hi Sarwan, It could be that this will not work because you use a CDN for serving both images and video content. According to https://support.google.com/webmasters/answer/178636?hl=en you have to prove ownership of the CDN hosting site before this will be allowed. Perhaps this is the same with video, as it talks about having to verify the CDN hosting site in Search Console before content is indexed/crawled. Good luck with this! Anders
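For reference, a minimal video sitemap entry uses Google's video sitemap extension namespace. This is only a sketch with placeholder URLs, not Sarwan's actual sitemap:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>https://www.example.com/videos/some-video.html</loc>
    <video:video>
      <video:thumbnail_loc>https://cdn.example.com/thumbs/123.jpg</video:thumbnail_loc>
      <video:title>Example video title</video:title>
      <video:description>A short description of the video.</video:description>
      <!-- content_loc may point at the CDN host - which is why that
           host may need to be verified in Search Console -->
      <video:content_loc>https://cdn.example.com/video/123.mp4</video:content_loc>
    </video:video>
  </url>
</urlset>
```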
| AndersS0 -
Redirect Process for Moving a Blog
Thanks, Dan! That was the route I suspected I'd have to take, but it helps to get validation!
| Flock.Media0 -
Re: Auto Detection of Currency based on IP & Google SEO
Thanks for taking the time out to reply, Umar, and I understand your point. Just the currency sign and the amount will change - nothing else at all. I didn't understand the bit about playing with the "currency converter" (unless you mean experimenting with it to drive major onsite changes - in which case the answer is no). Cheers
| amitgg0 -
Dynamic URL best approach
Canonical tags would be the most important thing to look at for your dynamic URLs. As long as each of your dynamic pages has a canonical tag pointing to the static version of the page (e.g. package-search/holidays/hotelFilters/), you won't have to worry about duplicate content.
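As a sketch, the canonical tag goes in the `<head>` of every dynamic/filtered variant and points at the static URL. The path is taken from the example above; the domain is a placeholder:

```html
<!-- On each dynamic variant of the page -->
<head>
  <link rel="canonical"
        href="https://www.example.com/package-search/holidays/hotelFilters/" />
</head>
```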
| iSTORM-New-Media0 -
Why is my website not being indexed?
I changed my robots.txt, but my website's pages are still not being indexed. Please help me.
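Since the question mentions robots.txt, it's worth double-checking that the file isn't still blocking crawlers. A minimal robots.txt that allows all crawling looks like this (an accidental `Disallow: /` would do the opposite and block the whole site; the sitemap URL is a placeholder and the line is optional):

```txt
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```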
| ramansaab0 -
Issues with getting a web page indexed
Thanks Umar! I guess the issue was with the canonical tag on the page, which wrongly pointed to the cached page you mentioned. I have corrected it now. Let me wait a day to check whether the page gets indexed. Thanks again for taking the time to look at the issue! Regards Ramesh
| RameshNair0 -
Get List Of All Indexed Google Pages
Hey Max, Yes, it's possible. URL Profiler can do this for you. Check out the guide here: http://urlprofiler.com/blog/google-index-checker/ Hope this helps! Umar
| UmarKhan0 -
Accessibility / display none
Hi there! Just a heads-up, you may get more of a response if you can provide a bit more detail about what it is you're seeing. What problems are you having? What's the context? The more info you can provide, the better.
| MattRoney0 -
Disavow links and domain of SPAM links
I'm afraid there's no easy answer. The security side is beyond the scope of Q&A (it's just too dependent on your platform/host/etc.), but locking that down is definitely the biggest and first step. Obviously, though, you can't stop third-party sites from getting hacked.

Disavow can be done at the domain level. There are some oddities, like Wordpress.com (where sub-domains act more like stand-alone domains), but for most sites, if most links from a domain are malicious, disavow the entire incoming domain.

Make sure your core links are clean. If you have a solid base of links, and you're not dealing with a lot of quality issues, it's tough for these kinds of hacked links to cause as much harm. Google knows this happens. Unfortunately, if your core link profile is a mess or weak, then it's a lot easier to take damage. So this is a battle on two fronts: stop the attack and, at the same time, clean up your core link profile and strengthen it as best you can.

There are a lot of link removal tools now, but honestly, they're a starting point. You need to dig in and evaluate what they give you, so that you're not taking out links that are potentially good. Right now, this is a labor-intensive process, I'm afraid.
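Domain-level disavow uses Google's plain-text disavow file format, uploaded through the Disavow Links tool. A sketch (the domains and URL here are placeholders):

```txt
# Disavow file - one directive per line; lines starting with # are comments.
# Disavow every link from a hacked/spammy domain:
domain:spam-example-1.com
domain:spam-example-2.net
# Individual URLs can also be listed:
http://another-example.org/hacked-page.html
```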
| Dr-Pete0 -
Has anyone had issues with Bazaarvoice and schema.org?
Hi Justin, Some of our team picked up on this question and asked me to send you a quick response. In Schema.org markup, you should only declare one product per page, and you have two. Although the Rich Snippets testing tool will say the markup is "All good", Rich Snippets will not show in the SERP. It also looks like your individual reviews aren't nested correctly inside of the schema.org/Product tag - that could be causing an issue as well. We'll follow up with you through our Support team! We definitely want to get this working with you.
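To illustrate the nesting described above, here is a sketch of schema.org markup with a single Product and the reviews nested inside it. All values are placeholders, and this uses JSON-LD rather than whatever syntax Bazaarvoice actually emits:

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Product",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.5",
    "reviewCount": "12"
  },
  "review": [
    {
      "@type": "Review",
      "author": { "@type": "Person", "name": "Jane" },
      "reviewRating": { "@type": "Rating", "ratingValue": "5" },
      "reviewBody": "Works as described."
    }
  ]
}
```

The key point: one Product per page, with each Review nested inside the Product's `review` property.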
| SimonByrneBV0