Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Web Design

Talk through the latest in web design and development trends.


  • The 1st one works fine for me: Redirect 301 /tags/xp/ http://www.test.com/tags/windows-xp/ so if it is not working, there must be some other problem!

    | MoosaHemani
    0

  • Hey everyone! I fixed the problem; the answer was here: https://my.bluehost.com/cgi/help/498

    | DougHosmer
    0

  • I agree - when adding 'alt tags' it is essential to describe the image and not stuff it with keywords. Typically I will work my keyword into the description if it doesn't look spammy. For example, if I am working on a fence company's website and they have a page dedicated to "city fence", I will make the alt tag "privacy 'city' fence | 6ft pickets" or something to that extent, along these lines:
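    A rough sketch of how that looks in the markup (the filename and wording here are just placeholders for whatever fits the page):

        <!-- Alt text that describes the image and works the target keyword in naturally -->
        <img src="/images/privacy-fence-install.jpg" alt="privacy city fence | 6ft pickets">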

    | BeardoCo
    0

  • You can also add meta descriptions to the tags individually in WP. Just go to "Posts" > "Tags" and then edit each tag to have a meta description.

    On a side note, on a WordPress site it is really not a good idea to index your tag pages. Sure, 3 years ago you would have wanted to, because search engines put more value on how many pages your website had. In 2013 it has a number of negative consequences that Google especially doesn't approve of. First, it adds a ton of pages to your site too quickly, and Google likes things to grow at a natural rate. (If you have a team of 30 people publishing 5 blog posts a day, Google will see that and be OK with more tags, but if you're adding one post a day and 25 pages are being created, that's a problem.) Second, if you add one post and give it 5 new tags that have never been used before, Google will see 5 new pages with the exact same content on them. You will see these errors on your SEOmoz campaign report. Lastly, it's hard to control the meta descriptions, though it's easier with an SEO program.

    Leaving out your tags but indexing your categories may be a better option, but of course that raises the question: why use tags at all, right? Well, in the search engine world it's best to create for the user and not the search engine, so tags are used to help the user find other articles or posts like the one they have landed on. In the last 5 years tags were about creating huge numbers of pages, but nowadays you have to weigh the good against the bad. (This is my opinion and not that of SEOmoz.)
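    If you do decide to keep your tag pages out of the index, the usual approach is a robots meta tag on just those archive pages. A minimal sketch (in practice most people set this through an SEO plugin rather than editing the theme by hand):

        <!-- In the <head> of tag archive pages only: keep them out of the index
             while still letting crawlers follow the links to the posts they list -->
        <meta name="robots" content="noindex, follow">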

    | jonnyholt
    0

  • I don't really think it will be that simple to rank better with a website that hosts all of its images on some other server. It might help you with site load time, but if the CDN is not SEO friendly you will probably see a dramatic fluctuation in rankings. I would highly advise you to use a self-hosted ecommerce website and then properly optimize it according to SEO standards in order to rank better in the SERPs.

    | MoosaHemani
    0

  • If the "Subscribe" box is for RSS notifications, why not just have an RSS icon leading them to the feedburner page where they can subscribe? Personally I think that would be better than having a generic subscribe box just above the flashy newsletter signup. I'd figure a lot of people wouldn't understand what the difference was when first looking at your page and would either mistakenly subscribe when they meant to get the newsletter or overlook the subscribe box completely.
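    If you went that route, the markup is about as simple as it gets. A rough sketch (the feed URL and icon path here are placeholders, not your actual setup):

        <!-- Visible RSS icon linking straight to the feed -->
        <a href="http://feeds.feedburner.com/yourblog"><img src="/images/rss-icon.png" alt="Subscribe via RSS"></a>

        <!-- Optional autodiscovery link in the head so browsers and feed readers can find the feed on their own -->
        <link rel="alternate" type="application/rss+xml" title="Blog RSS feed" href="http://feeds.feedburner.com/yourblog">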

    | MikeRoberts
    0

  • Matthew, much appreciated. Thankfully I don't need to worry about redirects since it's just a transition to a new template. About 90% of the other elements will remain intact. Checking webmaster tools after the transition sounds really helpful. P.S. Thanks to everyone for your great responses!

    | howardd
    0

  • I think what you're seeing here is intentional behaviour, Stephen. It's Volusion's hack for working around the fact that their system doesn't handle 404s "correctly". Bottom line: when you see these, you still need to fix the issue with whatever URL was being sent to the 404 page, but don't worry that the 404 page itself seems to be "not found" according to its status code.

    Here's an explanation of why this is happening, if you're interested. Normally, when a user enters a URL that doesn't exist, the server sends back a 404 error header. In addition, the server knows that when it sends back a 404 status because there's no such page, it should also show its error page directly. For a number of reasons Volusion can't do this, so instead they've instituted a catch-all redirect: visitors to non-existent pages get a 301 redirect to a regular website page that has been made to look like a 404 page. Because that 404-looking page has been found and shown, it would normally have a 200 status, which means "page found OK". A little unorthodox, but OK so far as far as the user is concerned.

    BUT! When a user hits a "page not found", the search engines want to get an actual 404 status code back so they know not to index that non-existent URL. See the problem? If the search engine gets a 200 response, it will assume that is the real page the visitor was trying to reach and will index the non-existent URL with the 404-ish looking content. Bad. So even though you, the user, can see the error page (200), Volusion has to give it a fake 404 status to give the search engines the correct information.

    For a demonstration, go to this non-existent page: http://www.cochranemusic.ca/oops You can see in your browser's URL bar that the page address is still http://www.cochranemusic.ca/oops even though the page itself shows the server error page content. Now go to http://www.ecowindchimes.com/oops and notice that the URL in the address bar actually changes, because you've been forwarded to a page on your site called 404.asp. That's a real page on your website you're seeing that has been made to "look" like a server error page. Even though you've been redirected to a real (200) page, the server has to pretend it's a 404 status to mimic the correct behaviour.

    Whew - that was confusing to try to explain, so let me know if it's still not clear.

    Paul

    P.S. To server admins: I know I've oversimplified the difference between a server's own 404 error page and an actual website page made to look like a 404. I do know the difference, but for the sake of keeping this explanation as straightforward as possible, I've glossed over it.

    | ThompsonPaul
    0

  • Why wouldn't you put it in the root, and why no subdomain? We aren't planning on doing a subdomain; we're planning on using two different domains with Multistore.

    | SheffieldMarketing
    0

  • Hi! You can't make those yourself, but there are some things you can do to make it more likely that Google will make those links for you. Search Engine Land has a good post on this at http://searchengineland.com/google-jump-to-links-within-search-snippets-26603
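    If I remember the gist of that post correctly, Google builds those "Jump to" links from in-page anchors, so giving each major section an id and linking to those ids from a small table of contents at the top of the page gives Google something to work with. A rough sketch (the ids and headings are just placeholders):

        <!-- Table of contents near the top of the page, pointing at named anchors below -->
        <ul>
          <li><a href="#pricing">Pricing</a></li>
          <li><a href="#installation">Installation</a></li>
        </ul>

        <h2 id="pricing">Pricing</h2>
        <p>...</p>

        <h2 id="installation">Installation</h2>
        <p>...</p>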

    | KeriMorgret
    0

  • Hello Tom,

    Five or six years ago this may have been a problem, but I seriously doubt it is right now. The text is clearly visible to humans and bots and you are not doing anything malicious. The div with the text appears over the image and it views fine in text and full mode when looking at Google's cache of the page.

    Though both Pedram and Donford have answered your initial question well, I could not officially "endorse" them because I don't advocate keyword density ranges and do not agree that a manual review of the site would result in you being subject to any kind of penalty. Write naturally for the topic at hand, focusing content on the correct keywords and topics with the goal of being helpful to the reader. Don't worry about manual checks for this because it will be obvious to the person looking at the site that nothing fishy is going on.

    Good luck!

    | Everett
    0

  • Hi there,

    In order to tell Google that your sites are targeted to different locations (.com for the US and .co.uk for the UK), and that despite having very similar, almost duplicated content this is not an issue, you should:

    1. Add the rel="alternate" hreflang="x" tags to your pages, targeting them to their specific countries: hreflang="en-gb" for the UK and hreflang="en-us" for the US. This way you're specifying that these pages are both in English but targeted to different countries (see the sketch in the P.S. below).

    2. Even if you're offering the exact same products or services, try to differentiate the content by localizing key elements of your pages' content. You can do this by adding the names of the countries (or cities, if you have different content per city) to your titles, meta descriptions and headings, and also by adding the specific addresses and currencies you have for the different countries. This way search engines will see that the content is not exactly the same and includes references to different locations.

    Take a look at this post I wrote some time ago where I shared the main criteria and aspects to take into consideration for international websites.

    Thanks!
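    P.S. A minimal sketch of those hreflang annotations in the head of each page (the URLs are placeholders; each version should reference both itself and its counterpart):

        <!-- Included on both http://www.example.com/page and http://www.example.co.uk/page -->
        <link rel="alternate" hreflang="en-us" href="http://www.example.com/page" />
        <link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/page" />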

    | Aleyda
    0

  • Are you using a CNAME of your main domain for that server, like sounds.example.com? You should, if you aren't. I don't think PageRank can flow to a sound file, though I am not entirely sure.

    | FedeEinhorn
    0

  • I agree with Jesse. Pull quotes aren't the problem. Copying an article and adding nothing extra of value is the duplicate content problem.

    | Thos003
    0

  • There is no harm it can do to your SEO, but in my opinion dark text on a light background is just easier on the eye. What is your bounce rate like? If it's quite high (50%+) then it may be worth looking at.

    | KarlBantleman
    0

  • He seems to have sorted it out now -- touch wood. Thank you for your replies.

    | Jeepster
    0

  • Hi Laura,

    Both options are feasible really. 301ing the domain will pass the link equity to the shed finders site, which in turn may help it to rank for that keyword. It may warrant a bespoke landing page for the keyword, in which case creating a URL for that keyword and then redirecting the root domain to that page would be a good solution.

    However, you or your client may prefer to rank for the keyword with this microsite, if they feel that site better suits the purpose of the keyword. In this case, carrying on your link building and inbound marketing efforts for that site should inevitably help it rank for that keyword. You would want to make the site as informative as possible, while also making it as authoritative as possible, which may include guides, articles, videos and other rich media, social media integration and so on. Creating this 'authoritative status' could help it rank quicker, as Google does give favour to those sites that can be recognised as a brand.

    If you do all of that, you would feasibly be able to link the sites to each other, but do take care if you do so. It could very easily look like a link network or scheme if it is overdone, while the reciprocal link itself could look suspicious to Google if it looks as though it is just there to pass PageRank and not direct the user to another authoritative website. A recent Matt Cutts video on site interlinking may be of interest to you: http://youtu.be/x0-jw_PfwtY

    Hope this helps Laura!

    | TomRayner
    0

  • Sadly, no. It's tricky, but your best bet is probably to deliver a non-AJAX version to Google (or make the AJAX crawlable, although that depends entirely on your implementation) and then use rel=prev/next on Google's version. This is tricky at best, but Google has to be able to crawl the paginated URLs somehow. Just so I understand - the "View All" isn't really a view all without being able to call the AJAX, right? You might want to check out: https://developers.google.com/webmasters/ajax-crawling/
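    For reference, the prev/next annotations are just link tags in the head of each paginated URL in the crawlable series. A rough sketch with placeholder URLs (page 1 would carry only the "next" tag, the last page only "prev"):

        <!-- On page 2 of the non-AJAX, crawlable version of the series -->
        <link rel="prev" href="http://www.example.com/products?page=1">
        <link rel="next" href="http://www.example.com/products?page=3">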

    | Dr-Pete
    0

  • Thanks, Dr. Pete. I'll discuss the options with our dev team and see which one will cause the least amount of developer caffeine consumption.

    | sbaylor
    0

  • Visits dropping to zero is a bit odd. Have you confirmed your analytics code is installed and tracking properly?

    | LynnPatchett
    0