Category: Technical SEO Issues
Discuss site health, structure, and other technical SEO issues.
-
Why would I suddenly start seeing a spike in hits from particular bots (specifically rogerbot, google, bing, and yahoo)?
Hi Kasy, did you get to the bottom of this?
| CraigBradford0 -
Trackback URLs & temporary re-directs
Hi Lewis, Can you share an example? Am I correct in thinking you mean external links from your blog posts (trackbacks) are going through 302 redirects? Craig
| CraigBradford0 -
URL path randomly changing
If that was the final product page, then yes, you should be using a standard .htaccess rewrite rule to ensure that the final product URLs are always www.domain.com/productx. That said, the way you had it is totally fine if all the other URL possibilities have a canonical tag that points back to the optimised version (the original product URL, www.domain.com/productx). The .htaccess rewrite isn't something you should be handling manually, though. Magento has that option built in, and it would be a fair amount of work to do by hand; I would just run with the canonical option if that were the case. Any good eCommerce platform should have the built-in ability to automatically remove the category folders and other search queries from the final product URL. Sometimes it's fine to leave the category folders in the URL; it just depends on the products being sold. For example, I would leave the category folders in the URL if I were selling different colored soccer balls: www.sports.com/soccer-balls/black-white/ www.sports.com/soccer-balls/blue-white/ www.sports.com/soccer-balls/red-yellow/
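As a rough sketch of the two options described above (the domain, paths, and product slug are all hypothetical placeholders, and real Magento installs generate this for you):

```apache
# Hypothetical sketch: 301 category-prefixed product URLs
# to the flat product URL. Adjust the pattern to your catalog.
RewriteEngine On
RewriteRule ^soccer-balls/([a-z0-9-]+)/?$ /$1 [R=301,L]
```

Alternatively, each category-prefixed variant could simply carry `<link rel="canonical" href="https://www.domain.com/productx">` in its head, which is the lower-effort option mentioned above.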
| Dezzign0 -
What are the negative implications of listing URLs in a sitemap that are then blocked in the robots.txt?
I highly doubt it would affect rankings due to low-quality issues, but it will show sitemap warnings in your GWT console. That issue is technically classified as a 'Warning' and not an 'Error'. The right thing to do in that scenario is to take the robots.txt block off and just use a 'noindex' tag on the pages. That way they can stay in the sitemap but they won't show up in the index. Otherwise, you should remove them from the sitemap if you don't want the warnings in GWT.
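A minimal sketch of the noindex approach suggested above, placed in the head of each page you want kept out of the index:

```html
<!-- Page stays crawlable (no robots.txt block) so Googlebot
     can actually see this directive and drop it from the index. -->
<meta name="robots" content="noindex">
```

The key detail is that the robots.txt block has to come off first: if crawling is blocked, the bot never fetches the page and never sees the tag.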
| Dezzign0 -
Invert canonicals?
If you change signals, Google will pick them up. They would much rather see signals change than see content handled incorrectly. -Andy
| Andy.Drinkwater0 -
Transferring a site to wordpress and its effect on SEO
Even after redirecting correctly, it could take a few days to a few weeks to get your stable positions back. So if your business would be hurt extra in summer, do not move until after the season (apart from the risk you are taking if you or your web developer doesn't have the experience to do this correctly).
| Stramark0 -
Similar pages on a site
I'm simply going to re-emphasize what others have said here: how similar the content is matters more than anything else. "Jumpers" is certainly broad enough that you can attack it from several different content angles. If your website sells jumpers, it's not unusual to have multiple pages about jumpers. The key is that every page should serve a specific purpose. If this isn't the case, work to find ways to either consolidate pages or make the purpose of each page uniquely valuable. Hope that helps! Best of luck with your SEO.
| Cyrus-Shepard1 -
What's going on with google index - javascript and google bot
Hello Or, I just checked the most recent cache, and it looks like Google does NOT see the content on the first URL (ending in /71232/) but does see it on the second one (ending in /69811/). This is the opposite of the situation you described above. Yes, Google "can" execute JavaScript, but just because they can doesn't mean they will every time. Also, perhaps not all of their bots can or do execute JavaScript every time. For instance, the bot they use for pure discovery may not, while the one they use to render previews may. Or they could have given the JavaScript only so long to execute. I also notice the page that is currently not fully indexed has an embedded YouTube video. While this wouldn't typically cause any problems with getting other content indexed, in your case it may be worth looking into. For example, it could contribute to the load-time issue mentioned above. When it comes to executing scripts, submitting forms, etc., Google is very much at the stage of just "trying stuff out" to "see what happens". It's like a hyperactive baby in a spaceship pushing buttons like crazy, which is why we run into issues with "spider traps" and with unintentionally getting dynamic pages indexed from form submissions, internal searches, and other oddities in site architecture. It is also one of the reasons why markup like Schema.org and JSON-LD is important: it allows us to label the buttons so the bot "understands" what it is pressing (or not). I apologize that there is no definitive answer for your problem at the moment, but given that the behavior has switched completely, I'm not sure how to go about investigating. This is why it is still very much a best practice to ensure all of your content is indexable without relying on JavaScript rendering. If you can't see the textual content in the source code (as is the case here), then you are at risk of it not being seen by Google.
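To illustrate the kind of labeling mentioned above (every name, URL, and date here is a placeholder, not taken from the site in question), JSON-LD describing the embedded YouTube video might look like:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "VideoObject",
  "name": "Example product walkthrough",
  "description": "Placeholder description of the embedded video.",
  "thumbnailUrl": "https://example.com/video-thumb.jpg",
  "uploadDate": "2015-01-01",
  "embedUrl": "https://www.youtube.com/embed/VIDEO_ID"
}
</script>
```

This won't make the surrounding JavaScript-rendered content indexable, but it does tell the bot what the embed is without requiring it to execute anything.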
| Everett0 -
Sitelinks only show when the URL is searched- Why don't they show when our company name is searched?
Google is fickle when it comes to sitelinks. There's not a lot you can do to control them aside from telling them which pages NOT to show for certain URLs in Webmaster Tools. There's definitely a correlation between your highest-authority pages and pages that appear prominently on your site by default. The only thing you can really do to get site links is to prove to Google that you're a brand and that you appear more prominently in a search for something like your brand name. It's less a technical matter and more a matter of spreading the word about your brand. Links, shares, online mentions, and real-world awareness branding actions are all likely to help.
| Carson-Ward0 -
Deals that expire: what should I do?
Thanks, mate. That was a very informative answer. It's not that I am making hand-made items as in Matt's example, but I have a very small number of deals that come and go every now and then. I will not redirect or serve a 404. I think I will keep the page but explain that the deal is over and that there are more deals relevant to this one. Forcibly redirecting, in my opinion, is the worst option in every situation except when the intent of the next page is exactly the same as the previous one, which 99% of the time it is not. A 404 could be OK, but the deals I offer are hard-won and I don't want the traffic to just hit a 404 wall. Adding the relevant deals seems like the best way to go. Thanks again.
| Angelos_Savvaidis0 -
Server 902 issue
Hi there According to Moz's Errors in Crawl Reports: "902 Unable to contact server The crawler resolved an IP address from the host name but failed to connect at port 80 for that address. This error may occur when a site blocks Moz's IP address ranges. Please make sure you're not blocking AWS." Here's a Q+A thread with resources and tips, as well as input from Moz Staffer Sam Weber. Hope this helps! Good luck!
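A 902 is a connection failure, so the block is usually at the server or firewall level (Moz's AWS IP ranges being denied) rather than in robots.txt; still, while you're checking, it's worth confirming robots.txt isn't also shutting the crawler out. A minimal sketch that explicitly permits Moz's crawler:

```
User-agent: rogerbot
Allow: /
```

If robots.txt is clean, the next place to look is the firewall or hosting provider's bot-blocking rules.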
| PatrickDelehanty0 -
Rel-canonical and meta data
No problem at all Ben. Good luck with the work - been there many times -Andy
| Andy.Drinkwater0 -
Issue with Title of Homepage - Wordpress Platform
The guys above have detailed the more technical aspects; the common issue is the length of the title. One pixel over 512 pixels and the title truncates as you have described, and I have seen it truncate from as little as 487 pixels. I tested your example, and it is 580 pixels: XYZ Company: Primary Keyword in Ontario Canada - XYZ Company. So for your example, to reduce the pixel width (and for a better CTR), I recommend you use the pipe | rather than the hyphen -, and you may also need to remove "Canada" etc. To confirm: have you checked that the title is 512 pixels or less? If it is still truncating, test at 487 pixels to be safe. Given it is such a common issue, I built my own title tester so I could measure the pixels: https://www.predikkta.com/products/free-serp-optimizer-tool.html Hopefully it is as simple as outlined above. I'd add that it is worth getting right, as the title has a strong influence on CTR.
| ClaytonJ0 -
Dup Title tags
Heheh, well I'd love that! 47 "not found" errors could count as "lots". By the way, where are those links coming from? What you have to bear in mind is that it's not the same if the 404 is caused by internal or external linking. If internal, you have to see which page is creating the 404 and fix it. If external, you have to ask the webmaster to fix the link, and meanwhile have a 301 in place so you can save the value of those links.
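For the external-link case above, a minimal .htaccess sketch (the old and new paths are hypothetical placeholders) that preserves the value of inbound links while you wait on the other webmaster:

```apache
# Hypothetical: permanently redirect the broken URL that
# external sites still point at onto its current equivalent.
Redirect 301 /old-page/ https://www.example.com/new-page/
```

The 301 passes the link equity through to the live page, so fixing the external links themselves becomes a nice-to-have rather than urgent.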
| mememax0 -
Referencing links in Articles and Blogs
I'm not 100% clear on the question, but I think what you are asking is "Does Google recognize footnotes, etc.?", which is where superscript is often used. The short answer is yes; I do not believe the size of words is any kind of factor for Google in search. Referencing could only be a positive, if only from a semantic perspective. Hope this helps.
| ClaytonJ0 -
Soft 404's on a 301 Redirect...Why?
Mememax, thank you. I did not know this. Have you tried the Custom 404 Widget?
| EGOL0 -
Utilising Wordpress Attachment Pages Without Getting Duplicate Content Warnings.
Hi, Are you using Yoast for your SEO? I would have a look at the Post Types and Taxonomies in the Titles & Metas section, as you can set rules for noindexing there if you wish. This would certainly allow you to keep the attachment pages but noindex them. Alternatively, install / make use of the gallery functions in Wordpress. Not knowing your site, I don't know if this would work for you, though. -Andy
| Andy.Drinkwater0 -
How can I use MOZ to investigate my recent drop in domain authority?
Patrick nailed it, but I'll add that when you see a drop in DA, the first thing you should do is look up your competitors in Open Site Explorer. If they've dropped too, then it's very likely you're just seeing the effects of an index refresh.
| MattRoney0 -
When is Duplicate Content Duplicate Content
We actually have two nearly identical ecommerce websites with identical product descriptions, and prior to us implementing a cross-domain canonical, we would frequently have both sites ranking in the top 20 for a keyword. The cross-domain canonical really helped; we saw a massive rankings increase.
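A minimal sketch of a cross-domain canonical like the one described above (both domains are placeholders): each duplicate product page on the secondary site points at its counterpart on the primary site from within its head:

```html
<!-- On site-b's copy of the product page, consolidating
     ranking signals onto site-a's version. -->
<link rel="canonical" href="https://www.site-a.com/product-x">
```

Google treats this as a strong hint rather than a directive, but in practice it usually consolidates the duplicate pair onto the canonical URL, which matches the outcome reported here.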
| AMHC0