Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • In terms of SEO, there are no penalties for using a gTLD. Technically you have an equal chance of ranking well for a keyword as a site with a .com domain. It comes down to preference, marketing, and the SEO work you put in. I can see the .training domain working well for marketing to a younger audience; it may seem more appealing and modern. An older audience may be more accustomed to .com domains and feel they're more professional. Exact match keyword domains became popular because they would often get great anchor text when people linked to the website. I don't see why you would have an issue with that, because I would naturally link to the site as "Foo Training." To answer your question plainly: no, it won't be seen as spammy, but the .com may seem more professional and credible to some users. I hope this helped make your decision easier.

    | Chris_Hickman
    0

  • It's not a good or bad thing in itself if Google has indexed 27,000 pages on your site. What's important is: how many of your pages has Google indexed, how many of those are duplicates, and how many of those are unimportant pages?

    To find how many of your pages Google has indexed:

    1. Make sure that you have an XML sitemap that includes EVERY PAGE on your site.
    2. Submit that to Google Search Console.
    3. Check back in a day or so, and Google Search Console will show you how many of the pages on your XML sitemap are in their index.

    If you only have 5,000 pages on your site, but Google shows that you have 27,000, then you're probably dealing with duplicate pages, or tags that are built by your site and have gotten out of control.

    To find out if Google has indexed duplicate versions of your pages:

    1. Use Google Analytics' search channel report (Acquisition > All Traffic > Channels > Organic Search, then change the Dimension to Landing Page) to download a list of all pages on your site Google or Bing has sent visitors to.
    2. Create a new column, stripping out portions of the URL that can change without changing the content of the page. Some common offenders: parameters, different categories earlier in the URL, www vs non-www, https vs http.
    3. Now, find the duplicates in the "unique" column. I like using the COUNTIF function for this. (There's a rough sketch of this step below.)
    4. For any page that's got duplicates, find a way to fix those! It varies by the problem, but the answer is generally to 301 redirect all versions to one canonical version.

    To find out if you have too many tags because they're autogenerated:

    1. Use an internal tool, like an XML sitemap creator, to figure out how many tags you have.
    2. Compare that to the number of tags actually bringing you traffic.
    3. Check to see if you're creating tag pages that have the same content as your category pages.
    4. Check how many tag pages you have with fewer than five products matching - those probably aren't great pages!

    Going through these steps should let you know if your site is well optimized - again, it's not the total count that matters! Good luck!

    Kristina
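    If it helps, here's a rough sketch of that normalize-and-count step in Python rather than a spreadsheet. The filename and the exact stripping rules are placeholders, so adjust them to whatever your URLs actually vary on:

    ```python
    # Rough sketch: collapse each landing-page URL to a normalized form,
    # then count how many raw URLs share it (the COUNTIF idea).
    # "landing_pages.txt" is a placeholder for your Analytics export.
    from collections import Counter
    from urllib.parse import urlsplit

    def normalize(url):
        parts = urlsplit(url.strip().lower())
        host = parts.netloc.replace("www.", "", 1)  # treat www and non-www as one
        return host + parts.path                    # drops scheme and ?parameters

    with open("landing_pages.txt") as f:
        counts = Counter(normalize(line) for line in f if line.strip())

    for page, n in counts.most_common():
        if n > 1:
            print(f"{n} versions of {page}")
    ```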

    | KristinaKledzik
    0

  • Guidelines are general. Every website is a different case, so you have to check whether your markup is spammy or not. Spammy means things like: selecting too much of the page's content for structured markup, improper use of markup, markup nested inside markup, and so on...

    | PenaltyHammer
    0

  • Correct, hiding is a bad idea, which is what I mentioned.

    | DmitriiK
    0

  • Thanks, Egol. Their site content is definitely good, so we're looking good on that front. Thanks for your feedback.

    | Caro-O
    0

  • Yeah, these articles are pretty good and have all you'd need. Still, I wonder why you necessarily want to go with a separate CDN rather than a host that provides a CDN by default. Off the top of my head: Google App Engine, Liquid Web, and others.

    | DmitriiK
    0

  • Thanks, Rob. That really does help tremendously. Your point about Google being 2-4 weeks behind the curve makes a lot of sense, so I hope that the work we're doing at the moment will start to bear fruit. That said, I can still see a lot of merit in the points you have raised. Yes, anything you can suggest to give me more to work with would be really helpful. Perhaps we can have a conversation away from here? My Skype is: sushihosting. Perhaps we can have a text chat there? Many thanks again, Bob

    | SushiUK
    0

  • PM sent with the code (in case you're okay making your own .htaccess file) or a link to a zip file that has a ready-made .htaccess file if you prefer. Note: this will only work if you use Apache and have .htaccess & mod_rewrite enabled... If you're on a Linux host, chances are this is okay, but if in doubt, my contact details are in the PM. Once done, feel free to email me the list of all the error pages if you still have it, and I'll paste it into Screaming Frog in list mode and check that all the error pages now return 301s for you.
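    For anyone following along at home, here's a rough sketch of that last check in Python, similar to what Screaming Frog's list mode reports. It assumes the `requests` package is installed, and urls.txt is a placeholder for your list of old error-page URLs:

    ```python
    # Rough sketch: confirm each old URL now returns a 301 and see
    # where it redirects to.
    import requests

    with open("urls.txt") as f:
        urls = [line.strip() for line in f if line.strip()]

    for url in urls:
        resp = requests.head(url, allow_redirects=False, timeout=10)
        location = resp.headers.get("Location", "-")
        print(f"{resp.status_code}  {url}  ->  {location}")
    ```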

    | MikeGracia
    0

  • Another option: if the site uses a CMS, you can create a sitemap for content pages/posts etc. Personally, I'm with Krzysztof Furtak on SF - Screaming Frog rocks. It'll find most pages, except perhaps orphan pages, as it wouldn't be able to find a link to crawl to discover them. If it's really important to get as many pages as possible, I'd do the following (I've put an asterisk (*) next to the ones that some people may think are a tad extreme):

    1. Run a Screaming Frog crawl.
    2. Grab a sitemap from your CMS.
    3. Check any server-based analytics (AWStats etc.).
    4. Check your access_log file & parse out the URLs in there. (*) (There's a rough sketch of this step below.)
    5. Run site: queries, with & without www, and also using * as a subdomain (use something like Moz's toolbar to export).
    6. As Krzysztof suggests, Scrapebox would extract data too, but be careful scraping - you may get an IP slap. (*)
    7. Export crawl data from Moz & a tool such as Deep Crawl.

    Throw the pages from all of these into Excel and de-dupe. Once you have a de-duped list, as an optional last step, go back to Screaming Frog, enter list mode (I have the paid version; not sure if it's possible with the free one) and run a crawl over all the de-duped URLs to get status codes etc.

    If you're going to do this sort of thing a fair bit, buy a Screaming Frog license - it's an awesome tool and can be useful in a multitude of situations.
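    Here's a rough sketch of the access_log parsing step (item 4), assuming a standard Apache/Nginx combined log format - the filename is a placeholder:

    ```python
    # Rough sketch: pull every requested path out of an access log
    # and de-dupe it, ready to combine with your other sources.
    paths = set()

    with open("access_log") as f:
        for line in f:
            parts = line.split('"')
            if len(parts) > 1:
                request = parts[1].split()  # e.g. ['GET', '/some/page', 'HTTP/1.1']
                if len(request) >= 2:
                    paths.add(request[1])

    for path in sorted(paths):
        print(path)
    ```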

    | MikeGracia
    1

  • This does seem a bit odd, but the domain also doesn't have any inbound links, which I believe is one of the main things the Moz tool picks up. Perhaps you should build a couple of inbound links to the domain and see what happens?

    | seanginnaw
    0

  • Thanks ViviCa. Sorry for the typo in my second example. You answered my question despite it. Take care.

    | Healio
    0

  • Hi Simon,

    Just so you know, this won't really cause a big problem in terms of Google rankings or your site performance. There are two things you can do:

    1. In Yoast SEO, go to Titles & Meta -> Other and check off 'Noindex subpages of archives' - this will prevent subpages from even getting into Google in the first place.
    2. You can set unique titles for subpages by using the %%page%% variable. Go to Titles & Meta -> Taxonomies and, under Categories, Tags etc., do something like: %%tag%% %%sep%% %%sitename%% %%page%%. The %%page%% variable will output something like 'Page 2 of 4'.

    Hope that helps!

    -Dan

    PS - Really nicely done wedding photos!

    | evolvingSEO
    0

  • Great question. Even though it's been 3 years since this question was asked, I feel like it's been a hot topic for the past 10 years. No one knows the "REAL" answer; SEOs can only assume. In fact, several companies even offer "SEO hosting" as a package with unique IP addresses, aka a full C-class IP, so that every client is on a dedicated IP.

    Think of it as a house. You and your website are one person living in a home. Now imagine sharing your home with 1,000 people, or 1,000 websites. How does Google take you seriously if you're sharing a house, aka an IP address, aka page speed (bandwidth)? If you're on a dedicated IP, you are the owner of that website... no one's spammy links, gambling sites, porn sites, or directory submission sites will be shared with you, and if they are, people often question whether those sites should be connected to your site... even though the website names are different.

    I used an example from http://www.colocationamerica.com/why-a-dedicated-ip-address-is-important.htm that shows one website with a dedicated IP vs a shared IP. If you do a reverse IP lookup on the domain with a shared IP, you get websites that have no affiliation with your website and that can also be spammy and/or have spammy related links. From what I remember, that's when you have bad-neighborhood links, and I don't think your website wants to be a part of that.

    As far as costs, a shared IP is basically free, vs. a dedicated IP at $4-10/year.

    Now if we're going to talk about a dedicated server vs a shared server... there's no question that if you're serious about eCommerce sales, you want to go with the dedicated server. Prices vary from $99 up to $400 for reasonable speed. As an example, if you were selling water bottles for $10 and your cost was $2, your margins (with the cost of shipping and processing fees) contribute to a $5 net profit. If you sell between 5-30 bottles a month, there's no need for a dedicated server just yet. I would accept the fact that keyword rankings wouldn't be as high and would bank on eBay and Amazon to stay above water. But when you're serious about selling 30+ bottles, your sales will indeed offset the cost of a $99/month dedicated server. Not only will the speed of the dedicated server lower the bounce rate and increase conversions in your eCommerce store, but your keyword rankings will get a significant boost when your page speed increases to a comfortable level, so that your customers don't get irritated by slow shared-server speeds.

    Best, Shawn
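    To make the water-bottle arithmetic concrete, here's a tiny sketch using the example figures above (the $3 shipping/processing figure is implied by the $5 net profit, not a real number):

    ```python
    # Rough sketch: break-even volume for a $99/month dedicated server
    # at $5 net profit per bottle ($10 price - $2 cost - ~$3 fees).
    import math

    profit_per_bottle = 10.00 - 2.00 - 3.00   # = $5.00 net
    server_cost = 99.00                        # per month

    bottles_to_break_even = math.ceil(server_cost / profit_per_bottle)
    print(bottles_to_break_even)               # 20 bottles/month
    ```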

    | Shawn124
    0

  • Brian, yes, this is the best practice. The canonical tag essentially tells the search engines that the letter page is a duplicate of what's on the other page, so they should give the credit to the other page. Technically speaking, those letter pages are crawled by the search engines, but since the canonical tag is there, the page is not indexed. Again, this is the best practice if you're going to have the content appear in more than one location. Ideally, I would probably split it up into separate pages (a page for each term) if you can write enough content for each term to have its own page. But, given the scenario you're outlining, this is most likely the best practice for your site. I'm assuming that the letter pages are clickable in your site's navigation and that users can click on them easily.
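    If you want to sanity-check that the letter pages really do point at the main page, here's a rough sketch (assumes the `requests` package is installed; the URL is a placeholder for one of your letter pages):

    ```python
    # Rough sketch: fetch a letter page and print the canonical URL
    # declared in its <head>. Assumes rel="canonical" sits inside a
    # <link> tag, as it normally does.
    import re
    import requests

    html = requests.get("https://example.com/glossary/a/", timeout=10).text
    for tag in re.findall(r"<link[^>]+>", html, flags=re.IGNORECASE):
        if 'rel="canonical"' in tag or "rel='canonical'" in tag:
            match = re.search(r'href=["\']([^"\']+)', tag)
            if match:
                print(match.group(1))
    ```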

    | becole
    0

  • Yes, I have read almost every Google guideline and post about it. I guess I will delete the description from the Organization markup, as that text is not found as such on the page. Thanks for the answer and for the help during these days!! The Kilgray Team

    | Kilgray
    0

  • Hi Brian, Dan (Moz Associate) here. Bernadette and Excal pretty much nailed it. Just wanted to add that OSE, Search Console, and other link tools may not always display every single link that exists out there on the web (especially OSE, which is the most 'filtered' index, showing mostly quality/relevant links and filtering out most of the spam). Regardless, the best course of action is indeed to make sure your broken pages return a proper 404 status code, and Google will handle the rest.

    | evolvingSEO
    0

  • Yes, do that: use hyphens throughout, and 301 redirect from the underscored versions.

    | GastonRiera
    0

  • Glad you figured it out. I honestly didn't think it would have been the canonicals. I'm a little surprised that the bots didn't just choose not to respect the suggestion as opposed to blanking your site from the index. Didn't think that was even a possibility from incorrect canonicals. Good to know for the future though in case anything like this comes up with anyone else's site.

    | MikeRoberts
    0

  • Thanks, Bernadette! Will go with canonical tags.

    | Lomar
    0

  • Hi vcj and the rest of you guys. I would be very interested in learning what strategy you actually went ahead with, and the results. I have a similar issue as a result of pruning, and removing noindexed pages from the sitemap makes perfect sense to me. We set noindex, follow on several thousand pages without product descriptions/thin content, and we have set things up so that when we add new descriptions and updated on-page elements, the noindex is automatically reversed. That sounds perfect; however, hardly any of the pages to date (3,000-4,000) are indexed, so I'm looking for a feasible solution for exactly the same reasons as you. We have metrics and optimization that are better than or comparable to a lot of the competition's, yet rankings are mediocre, so I'm looking to improve on this. It would be good to hear your views. Cheers

    | SilverStar1
    0