Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Web Design

Talk through the latest in web design and development trends.


  • If you download the CSV report, it should list the referring URL.

    | KeriMorgret
    0

  • Yes, you are doing the right thing. You may also want to look at including meta tags in the <head> as well.

    | DRSearchEngOpt
    0

  • Thanks all for suggestions...

    | dsingh1079
    0

  • Hi Carl, You should be able to do this with one rule:

    Options +FollowSymLinks
    RewriteEngine on
    RewriteRule (.*) http://www.newdomain.com/$1 [R=301,L]

    Give this a shot and see if it works. It should direct http://www.olddomain.com/villa_rental.php to www.newdomain.com/villa_rental.php and so forth. Cheers, Jane

    | JaneCopland
    0
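  • A fuller sketch of that kind of domain move, assuming Apache's mod_rewrite is enabled (the domain names are placeholders, and the RewriteCond is an addition so only requests arriving on the old hostname get rewritten):

    ```apache
    Options +FollowSymLinks
    RewriteEngine on
    # Only rewrite requests that arrive on the old hostname
    RewriteCond %{HTTP_HOST} ^(www\.)?olddomain\.com$ [NC]
    # 301 (permanent) redirect, preserving the requested path via $1
    RewriteRule ^(.*)$ http://www.newdomain.com/$1 [R=301,L]
    ```

    The R=301 flag makes the redirect permanent so search engines transfer ranking signals to the new URLs; L stops further rule processing.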

  • Yeah hiding keywords in your page is definitely not cool with Google.  Many people do it by making the text the same color as the background.  No one can see it, unless they highlight the area.  Used to be effective, now it's just spammy and will get you banned.

    | TotalMarketExposure
    0

  • Hi Morris, I am not seeing any issues, either. Please confirm whether or not this has been resolved, thanks! Christy

    | Christy-Correll
    0

  • Thanks! Yes I see the top comments in the discussion are around that topic. Can't wait to see what the determination is on impact of Titles and/or description specs.

    | IrvCo_Interactive
    0

  • Thanks so much Gianluca for this thoughtful and valuable advice. Yes, page load speed is definitely something that's been a concern. This is why we went back to 24 products displayed per page instead of 50 a few months ago. However, since then we've made some significant improvements in page load times and we think we can probably go up to 100 products per page and still be fairly fast. We will have to test. On the up side, we only have 7 categories with more than 100 products, and only 24 with more than 50. The biggest problem we have affecting speed isn't so much the images. It's the fact that the website does real-time pricing calls on every product to our business back end every time the page loads. This may be a sticking point.

    I have also thought about the canonical tag problem. Of course, it's a problem now too, but if the "View All" page just ends up getting that generic URL and no proper canonical tag...then we really are back to square one. The possibility of no-indexing all of the categories that are related to paginated series is something that crossed my mind yesterday, so it's interesting that you mentioned that. While it would solve certain issues, wouldn't this be a problem in terms of having valuable content in Google? Granted, some of our category pages are purely there for navigation purposes, in which case, I suppose there's no harm in no-indexing them. However, with the roll-out of Hummingbird I began looking at our category pages as valuable opportunities for "topics" pages that could act as a hub for visitors searching for products or information around specific uses or brands. Wouldn't there be a significant risk of losing valuable market share for key terms by removing so many category pages from Google's index?

    If I am understanding your last suggestion, you are saying to have the page default to "View All" and noindex everything else... You are right, not a great scenario, but you are also right in that this may be the only solution given management's steadfast stance on not wanting to pay to fix it. Lots to think about, but your comment has been extremely helpful. Thanks again!

    | danatanseo
    0
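  • The canonical and noindex options weighed above look roughly like this in a paginated page's <head> (a sketch; example.com and the paths are placeholders):

    ```html
    <!-- Option 1: on each paginated category page (e.g. /widgets?page=2),
         point the canonical at the View All version -->
    <link rel="canonical" href="http://www.example.com/widgets/view-all" />

    <!-- Option 2: keep the paginated pages out of the index entirely,
         while still letting crawlers follow the product links on them -->
    <meta name="robots" content="noindex, follow" />
    ```

    The two approaches are usually alternatives rather than a pair: a canonical consolidates signals onto View All, while noindex simply removes the paginated pages from the index.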

  • Thank you for your help. Andy and Michael

    | FhyzicsBCPL
    0

  • Thanks everyone! Will check these out when I get a chance, prepping for SMX West at the moment.

    | KeriMorgret
    0

  • I'm using a tool called GSiteCrawler at the moment. I'm new to it; however, it will list all crawlable pages and create a sitemap.xml for you too!

    | danwebman
    0
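  • For reference, the sitemap.xml such a crawler generates is just a list of URL entries following the sitemaps.org protocol, along these lines (example.com and the dates are placeholders):

    ```xml
    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2014-02-01</lastmod>
      </url>
      <url>
        <loc>http://www.example.com/about-us</loc>
      </url>
    </urlset>
    ```

    Only <loc> is required per entry; <lastmod> and the other optional fields are hints, not guarantees, to crawlers.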

  • This could turn into an extremely long post if we go into everything in detail here. A good site structure should be one that is hierarchical and sensible. By this, I mean it should make sense when someone lands on a page what your intentions are for them and how to achieve them. Keep your calls-to-action clear and concise and don't burden the page with wasted content.

    Your URLs should again be descriptive, so that if someone were to land on an internal page, the URL would tell them a bit about where they were. Are the key phrases you are targeting included in the URL itself? Also, you should aim to keep your URL structure to within 3-4 clicks of the homepage. Any more and it turns into a poor user experience.

    Your meta titles and descriptions should also be on target for the pages themselves. Something like this:

    Pin Code: Lohit, Arunachal Pradesh, India | Your Site Name

    or

    Pin Codes for Lohit, Arunachal Pradesh, India

    Whatever you decide to go for here, I would keep it to under 60-70 characters, with the most important information towards the start. I hope that helps a little. -Andy

    | Andy.Drinkwater
    0
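  • Those title and description recommendations translate into head markup like this (a sketch; the pin-code example follows the post above and the description text is made up):

    ```html
    <head>
      <!-- Most important phrase first, kept under roughly 60-70 characters -->
      <title>Pin Code: Lohit, Arunachal Pradesh, India | Your Site Name</title>
      <meta name="description"
            content="Find pin codes for Lohit district, Arunachal Pradesh, India." />
    </head>
    ```

    Search engines typically truncate titles beyond about 60 characters in results, which is why the key phrase belongs at the front.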

  • Hi Tim, Go with the approach you propose in the post; that's the proper way to do it, making sure the old pages will still be redirected to the new ones, preferably with the 'old' content.

    | Martijn_Scheijbeler
    0

  • I read your post at Mstoic Hemant and noticed your comment about Firefox 10. Since I couldn't get Dust-Me Spider to work in my current version of Firefox, I tried downloading and installing the older version 10 as you suggested. When I did so, I received the message that Dust-Me Spider was not compatible with this version of Firefox, and it was disabled. We are considering purchasing the paid version of Unused CSS (http://unused-css.com/pricing) - Do you have any experience using the upgraded version? Does it deliver what it promises? Thanks!

    | danatanseo
    0

  • Hi Chris, Don't get hung up on load times. These are not going to make a lot of difference if you have an amazing site. I have a number of sites of my own, all using a purchased WordPress single-page theme, and I rank 1st to 4th for a range of very highly competitive phrases. Find a theme you like, look for good reviews, and don't worry about having to go with a specific framework. -Andy

    | Andy.Drinkwater
    0

  • Structured data - schema is good for SEO! So... check whether structured data (aka schema, microdata) is being used on any given web page with Webmaster Tools: www.google.com/webmasters/tools/richsnippets If you are trying to improve the page, you can also use the Structured Data Markup Helper - an HTML conversion tool - to have Google auto-add schema to the page after you choose the relevant category: https://www.google.com/webmasters/markup-helper/ This is of course just one more method to use in addition to others mentioned.

    | DaveBrown333
    0
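  • As one concrete illustration of the kind of markup being described, schema.org data can also be embedded as JSON-LD in the page (a sketch; the product name, description, and price are made up, and microdata in the HTML itself is an equally valid alternative):

    ```html
    <script type="application/ld+json">
    {
      "@context": "http://schema.org",
      "@type": "Product",
      "name": "Example Widget",
      "description": "A sample product used to illustrate schema markup.",
      "offers": {
        "@type": "Offer",
        "price": "19.99",
        "priceCurrency": "USD"
      }
    }
    </script>
    ```

    Once added, the rich snippets testing tool linked above can confirm whether Google parses the markup as intended.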

  • Thank you Chris. I appreciate you taking the time to answer this question for me. Best wishes, Amelia

    | CommT
    0

  • Thanks Takeski, I put this into the footer and it worked!

    | Wesley-Barras
    0

  • Thanks for the question: "To maintain or improve our rankings I'm looking for specific information for the link structure. For example, is it better to have the 'about us'/rel=author on each domain, with contributors on that specific domain, or is it better to have them all in the (umbrella) brand domain."

    I'd say that it comes down to how much you want to differentiate your current brands from each other. If you want to keep them quite distinct and each one has its own team / writers / USPs etc., then you are best to have each domain have its own sections.

    "I think to maintain the rankings it is best to keep specific content (like blog / about us) on the domain. So is it best to just do site-wide links with a logo (like health.com), and what about hosting? We work with WordPress, so all domains will be hosted on one IP when we use the multi-site option of WP?"

    In terms of cross-linking, I'd say that you should try to do this where relevant, and as James has pointed out below, as long as you're not doing this in huge volumes, you should be fine and not trigger any problems with Google. The only time you may want to be careful is where your domains are competing for the same types of keywords. I would avoid cross-linking too much with exact-match keywords, as this could be seen as manipulative; I'd keep it on brand and to relevant pages.

    If the domains you own are in the same niches, then you may hit problems of them competing against each other. I'm not sure if this is the case? If they are in the same niche, then Google certainly can look at things like IP addresses and footprints that link the sites together. It is hard to know to what extent this would harm you, but Google certainly prefers to show diversity in search results rather than having multiple sites from the same company.

    | Paddy_Moogan
    0