Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Web Design

Talk through the latest in web design and development trends.


  • Nicholas, do you recommend any particular heatmapping tool? We've used Crazy Egg for years and I have no complaints, but I've been tasked with evaluating other options. Thanks for any insight you can provide...

    | matt-14567
    0

  • If you set up a reverse proxy properly, the search engines will treat the content in www.website.co.uk/blog/ as normal content on that main site. It's not trivial to set up, but it's designed for specifically this kind of scenario (e.g. a subdirectory displaying content using a technology that can't be installed on the main-site server). You'll also want to make certain that things like cookie persistence, Analytics, etc. have been accounted for. Hope that helps? Paul

    | ThompsonPaul
    0
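A minimal sketch of the reverse-proxy setup Paul describes, assuming nginx in front of the main site and a WordPress install on a separate host (hostnames and backend names here are hypothetical, not from the thread):

```nginx
# Main-site server block (www.website.co.uk)
server {
    listen 443 ssl;
    server_name www.website.co.uk;

    # Serve /blog/ transparently from the separate WordPress server
    location /blog/ {
        proxy_pass https://blog-backend.example.com/blog/;
        # Preserve the original host so WordPress generates main-site URLs
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
```

WordPress's Site/Home URLs would also need to be set to the main-domain /blog path so internal links and canonicals point at www.website.co.uk rather than the backend host.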

  • Yeah, I'd say that, if Google is typically indexing the "www" version, then you're probably ok, but you can't guarantee that. 301s/canonicals should help, too.

    | Dr-Pete
    0
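For reference, the two mechanisms Pete mentions, sketched under the assumption of an Apache host with mod_rewrite (the domain is a placeholder):

```apache
# .htaccess: 301 the non-www host to the www version
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

And the canonical hint in each page's head:

```html
<link rel="canonical" href="https://www.example.com/page/">
```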

  • Thanks Tomas and Mike - good advice - I have done that and found legacy stuff they've since moved away from - there is indeed no current use for the directives. I wonder whether there's any resource on the web that lists all robots.txt directives and interprets them - if not, then perhaps it would be an idea for Moz?

    | McTaggart
    0

  • Hi, On our website we show a list of industrial products in table form; we use a table format because we want to show the functionalities and capabilities of each machine. Regards

    | jcobo
    0

  • Also, some more information I can gather from your question: that noscript is telling non-JS users/bots to meta refresh to an error page. Google shouldn't be confused by that, but Screaming Frog would be (and potentially other search engines, too). It's probably also not the best experience for non-JS users: you can display an error message without redirecting to another URL. Hope that's helpful...

    | sergeystefoglo
    1
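An illustration of the alternative Sergey suggests: instead of a noscript meta refresh that sends non-JS visitors to a separate error URL, show the message in place (the markup below is a hypothetical sketch, not taken from the site in question):

```html
<!-- Avoid: redirects non-JS users and some crawlers to a separate error URL -->
<noscript>
  <meta http-equiv="refresh" content="0; url=/error.html">
</noscript>

<!-- Prefer: display the message inline, no redirect -->
<noscript>
  <p>This page requires JavaScript. Please enable it in your browser settings.</p>
</noscript>
```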

  • Hi SharonEKG, Did you see Paul's response to your question? If it answered your question, please mark it as a "Good Answer." And if you need clarification, please let us know how we can help. Christy

    | Christy-Correll
    0

  • If the subdomain targets keywords not targeted in the rest of the website, then rankings will slip. I would 301 all webpages to relevant pages on your main site. Any important keywords should be monitored. You should create related pages with content from the subdomain to maintain these keywords. If traffic is nonexistent, just 301 them.

    | Andrew-SEO
    0
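A sketch of the page-by-page 301 approach Andrew describes, assuming the subdomain runs on Apache (all hostnames and paths below are hypothetical examples):

```apache
# .htaccess on blog.example.com: map each page to its most
# relevant counterpart on the main site
RewriteEngine On
RewriteRule ^pricing/?$   https://www.example.com/pricing/      [R=301,L]
RewriteRule ^guides/(.*)$ https://www.example.com/resources/$1  [R=301,L]
# Catch-all for anything unmatched
RewriteRule ^(.*)$        https://www.example.com/              [R=301,L]
```

Mapping important pages individually (rather than sending everything to the homepage) is what preserves the keyword relevance he mentions.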

  • The number of internal links you make to a page shows the level of importance you place on that page within your domain. Fact check first sentence - https://support.google.com/webmasters/answer/138752?hl=en That's the lesser point here, though. There's a bigger issue I need to resolve: my most important pages are the ones Google perceives to have far fewer internal links, and these pages are not being shown in the SERPs; instead, the lower pages are showing. I will progress with checking the JS and robots.txt to see if I can get this crawl sorted.

    | Andrew-SEO
    0

  • Thanks István Keszeg & Gaston Riera. That helps a lot.

    | BBT-Digital
    0

  • The majority of the configuration needs to happen on the server hosting the main website, Alex, because essentially, the visitor isn't even going to be aware of the existence of (or URL for) the other infrastructure. They're going to go to example.com/blog, and example.com's server configuration is going to deliver them the content from someotherserver.com, which hosts the WordPress install. In addition to the proxy configurations, you may need to deal with cookies, SSL configurations, and potentially other server header information that needs to be maintained between the requests passing back and forth between the different servers. This is a pretty common requirement in enterprise configurations, though - to keep the software running the blog from potentially interfering with or compromising the security of the main site infrastructure. So like I said - eminently do-able, but not trivial to implement. Does that answer your question? Paul

    | ThompsonPaul
    0
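The server-side pieces Paul lists (proxying, cookies, SSL) could look roughly like this under Apache's mod_proxy and mod_headers, using the thread's example hostnames; an equivalent exists for nginx and other servers:

```apache
# httpd.conf on example.com's server
SSLProxyEngine On
ProxyPass        /blog/ https://someotherserver.com/blog/
ProxyPassReverse /blog/ https://someotherserver.com/blog/
# Rewrite cookies set by the backend so they persist on the main domain
ProxyPassReverseCookieDomain someotherserver.com example.com
# Forward the original protocol so the backend builds https URLs
RequestHeader set X-Forwarded-Proto "https"
```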

  • WooHoo, I got it! I installed the custom post type plugin, made a custom post type of "personal injury," changed the custom post type slug to have dashes instead of underscores, and made a dummy post with the name "whip its nitrous oxide."

    | julie-getonthemap
    0

  • In general, from what I know, the WordPress codex itself is relatively SEO friendly. When we're talking about which theme to go with – it's true a custom built theme is probably going to be less bloated and provide faster site speed. I would encourage you to audit (or have someone else audit) your competition. If you have similar content, similar link profiles, similar brand strength, etc... then site speed could be a factor that makes a difference – especially when Google issues a page speed update. However, if they have you beat on other fronts, I would put more effort into link building or content generation before worrying about which theme to go with. Does that make sense? It depends on many factors, but I would estimate building a custom theme will be at least 2X the work of utilizing a theme. And yes, the maintenance of custom themes and custom plugins is a good bit more involved than simply updating pre-built themes and plugins.

    | brooksmanley
    0

  • Hi Taylor, great question! I think the first thing to remember is that AMP is something you want to implement alongside a mobile-friendly website, not as a replacement. AMP pages by nature are faster than the typical webpage and so they should not be affected by this, according to Google's statement: "The “Speed Update,” as we’re calling it, will only affect pages that deliver the slowest experience to users and will only affect a small percentage of queries." They also note that query intent is a very strong signal and so in theory a very slow but very relevant page could still outrank a faster, less relevant result. My take on this update is that it is a sign of Google starting to focus on making the shift to mobile-first indexing. If you want more info on that, I wrote a blog post here on Moz about it recently: moz.com/blog/mobile-first-indexing-seo

    | bridget.randolph
    2

  • Thank you that answers my question.

    | seoanalytics
    0

  • Thanks, Marcus! Yes, there will not be much crossover, so this does feel like the best solution. I agree too about the homepage including more supportive content than Culligan includes, as well as developing the structured nav at that early stage. Very helpful. - Mark

    | m-johnson
    0