Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • No problem, Bruce. I would definitely recommend that you look through some recent blog posts and find some folks with high mozpoints. Hope this helps.

    | DonnieCooper
    0

  • For #1, your approach is solid. For #2, the length of the waiting period depends on how quickly your entire site is updated in Google. This time period varies based on the size of your site, how deep some content sits within your site, the popularity of your site's pages, and other factors.

    Post-migration, you want to confirm the 301 is working, ensure your pages are canonicalized correctly, then submit an updated sitemap to Google and Bing. Presently your Google SERPs should show links to your existing domain. After the migration, your most popular pages should show links to the new domain within a day or so, but it would not be unusual for it to take several weeks for all your pages to be updated. I would recommend waiting to make changes until the majority of the site's pages have had their URLs updated in the SERPs; that is the best measure that the migration has gone smoothly. Post-migration, you can then make any desired changes to the site and track the results of those changes without worrying about the migration contaminating your results. For site-wide changes, you can start by adjusting a few pages and seeing how your results are affected before applying the change to your entire site. (A sketch of the kind of post-migration check I mean follows below.)

    The campaign tool is wonderful. Look at the tutorial and learn everything you can about it. It can help you measure and track your site's SEO performance.
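    A minimal sketch of that 301/canonical check (my illustration, not part of the original answer; the URLs are hypothetical placeholders), in Python with the requests and BeautifulSoup libraries:

    ```python
    # Post-migration spot check: confirm each old URL 301s to the new domain
    # and that the destination page's canonical tag matches the new URL.
    # All URLs here are hypothetical placeholders.
    import requests
    from bs4 import BeautifulSoup

    OLD_URLS = [
        "http://www.old-domain.example/",
        "http://www.old-domain.example/about/",
    ]

    for old_url in OLD_URLS:
        # Don't follow redirects automatically, so we can see the status code.
        resp = requests.get(old_url, allow_redirects=False, timeout=10)
        if resp.status_code != 301:
            print(f"{old_url}: expected 301, got {resp.status_code}")
            continue
        new_url = resp.headers.get("Location", "")
        print(f"{old_url} -> {new_url}")

        # Fetch the destination and check its rel=canonical link element.
        soup = BeautifulSoup(requests.get(new_url, timeout=10).text, "html.parser")
        canonical = soup.find("link", rel="canonical")
        href = canonical.get("href") if canonical else None
        if href != new_url:
            print(f"  canonical mismatch: {href}")
    ```

    Spot-checking a handful of your most important URLs this way is usually enough to confirm the redirects are wired up correctly before you start watching the SERPs.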

    | RyanKent
    0

  • I don't know what the rest of that passage says, but I don't see in that excerpt where the author says they should be separate. All he is saying is what a blog is meant to accomplish. It can still be part of your site, hosted on your domain, without being fully integrated with all of the pages. It can look a little different, have a different feel about it, etc., but still run within the parameters of your website.

    | CodyWheeler
    0

  • From Google WMT: "Sitelinks are completely automated, and we show them only if we think they'll be useful to the user. If your site's structure doesn't allow our algorithms to find good sitelinks, or we don't think that the sitelinks are relevant to the user's query, we won't show them." The only three things I can think of are:

    1. Ensure your site is as crawlable as possible by reviewing and addressing any crawl errors (a quick crawl-depth sketch follows below this list).
    2. Consider the layout of your site for the sitelinks. Do you have your site structured in such a way that it has good landing places for a sitelink? Hierarchical structures with category pages can work well for this use.
    3. Relevancy to the search is also a factor.
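    A rough sketch of the crawlability check from point 1 (my own illustration; the start URL is hypothetical): a small breadth-first crawl that reports how many clicks deep each internal page sits. Pages one click from the homepage, such as category pages, tend to be the natural sitelink candidates.

    ```python
    # Breadth-first crawl from the homepage, recording click depth per page.
    from collections import deque
    from urllib.parse import urljoin, urlparse

    import requests
    from bs4 import BeautifulSoup

    START = "http://www.example.com/"   # hypothetical homepage
    MAX_PAGES = 50                      # keep the sketch polite and small

    host = urlparse(START).netloc
    depths = {START: 0}
    queue = deque([START])

    while queue and len(depths) < MAX_PAGES:
        url = queue.popleft()
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue
        for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            link = urljoin(url, a["href"]).split("#")[0]
            # Stay on the same host and skip pages we've already seen.
            if urlparse(link).netloc == host and link not in depths:
                depths[link] = depths[url] + 1
                queue.append(link)

    for link, depth in sorted(depths.items(), key=lambda kv: kv[1]):
        print(depth, link)
    ```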

    | RyanKent
    0

  • I agree with Cody, but would like to ask if you can offer a specific example. It is always helpful to look at the exact result.

    | RyanKent
    0

  • While I have read that a business name matching the domain name exactly can be extremely beneficial, having the city name alongside the appropriate keyword, plus proper optimization, should make for an extremely strong listing.

    | dignan99
    0

  • Hi Shirle,

    Could you provide more information on the error? I'm assuming that this is appearing when you look at a Crawl Diagnostic Summary? If so, is it an ERROR, WARNING or NOTICE? Clicking down through the error messages and then on an example of the error usually provides you with a short summary of the nature of the problem. I suspect it may not be a technical issue, i.e. something being 'broken', but rather an issue related to the links generated by the trackback.

    Trackback (taken from the WordPress site): "Trackback helps you to notify another author that you wrote something related to what he had written on his blog, even if you don't have an explicit link to his article. This improves the chances of the other author sitting up and noticing that you gave him credit for something, or that you improved upon something he wrote, or something similar. With pingback and trackback, blogs are interconnected. Think of them as the equivalents of acknowledgements and references at the end of an academic paper, or a chapter in a textbook."

    Here is another clear summary of what a trackback is: http://www.optiniche.com/blog/117/wordpress-trackback-tutorial/

    A trackback isn't bad for SEO if it generates a link to a genuine article on an above-board site. However, spammers will try to exploit this to generate additional inbound links to their spammy site, and of course you don't want to tarnish your own site's reputation by linking to spammy sites. If someone links to your blog post and you allow a trackback, by publishing it you are giving a link back to that particular site. Google doesn't reward excessive link exchanges, and the practice may lead to a site being penalised. You can of course turn off trackbacks in WordPress if you can't see any benefit to them (or don't need them).
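    For what it's worth, the trackback mechanism itself is just a small HTTP protocol: a form-encoded POST of a few fields to the target post's trackback URL. A minimal sketch (my illustration, not from the answer above; all URLs and values are hypothetical):

    ```python
    # Sending a trackback ping per the Trackback protocol.
    import requests

    trackback_url = "http://example.com/blog/some-post/trackback/"  # hypothetical
    payload = {
        "title": "My related post",
        "url": "http://myblog.example/my-related-post/",
        "excerpt": "A short summary of what I wrote...",
        "blog_name": "My Blog",
    }
    resp = requests.post(trackback_url, data=payload, timeout=10)
    # The receiving blog replies with a small XML document;
    # <error>0</error> means the ping was accepted and will typically
    # be published as a link -- which is exactly why spammers abuse it.
    print(resp.status_code, resp.text[:200])
    ```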

    | Hurf
    0

  • If you want to rank competitively in the US and AU, then I strongly recommend local domains. In my experience, subfolders on a single domain are less effective than local domains. The above solution can work, and you have the advantage that your CA site is established and is the "parent" site. It is more work to maintain separate domains, but in my experience of managing .com sites, we have always faced a challenge: despite the target setting in WMT, Google still struggles in many instances to rank the UK site in the UK over the US site, because it is on a .com and because there is a US version of the site.

    | Red_Mud_Rookie
    0

  • It all really depends, I guess. Is Johnston a city or a suburban community? If I sent something to your address and put Des Moines on it, would it arrive? How many people search for hotels in Johnston versus Des Moines? Whichever one you choose, you'll have to get everything to match.

    | dignan99
    0

  • Our normal site structure works fairly well for most of what we sell, but for some reason not for Widgets. The idea is just to shake it up and try something new. But like you said, I'm stopping and thinking about every aspect of this since it is something new. I'm trying to pick it apart from all angles before proceeding. Thanks for the insight!

    | rball1
    0

  • Speculating about the next couple of years of SEO is one thing. The further out you go, the more variables you introduce and the less relevant the ideas become. A few years ago most people could not predict the impact Twitter and Facebook would have on SEO. Many still don't understand it and need to be awakened by clear visual charts. I am not willing to speculate further than I already have, but I will watch your question and hope it spurs other ideas.

    | RyanKent
    0

  • More pages doesn't always mean a better SERP position. There is no black and white answer on this. Here is a helpful excerpt from a recent blog post by Dr. Pete:

    "A common example is when you take a page of content and spin it off across 100s of cities or topics, changing up the header and a few strategic keywords. In the old days, the worst that could happen is that these pages would be ignored. Post-Panda, you risk much more severe consequences, especially if those pages make up a large percentage of your overall content. Another common scenario is deep product pages that only vary by a small piece of information, such as the color of the product or the size. Take a T-shirt site, for example – any given style could come in dozens of combinations of gender, color, and size. These pages are completely legitimate, from a user perspective, but once they multiply into the 1000s, they may look like low-value content to Google.

    The Solution: Unfortunately, this is a case where you might have to bite the bullet and block these pages (such as with META NOINDEX). For the second scenario, I think that can be a decent bet. You might be better off focusing your ranking power on one product page for the T-shirt instead of every single variation. In the geo-keyword example, it's a bit tougher, since you built those pages specifically to rank. If you're facing large-scale filtering or devaluation, though, blocking those pages is better than the alternative. You may want to focus on just the most valuable pages and prune those near duplicates down to a few dozen instead of a few thousand. Alternatively, you've got to find a way to add content value, beyond just a few swapped-out keywords." (Source)

    In your case, if you are able to make pages that are truly unique, valuable, and genuinely helpful to users, you definitely stand to benefit your overall traffic. However, if I were in your position I would focus on making more like 50-100 pages with better, higher-quality content, rather than 1000's of pages with just a little content.
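    If you do go the META NOINDEX route for variant pages, a quick audit sketch like this can confirm the tags ended up where you intended (my own illustration, not from Dr. Pete's post; the URLs are hypothetical):

    ```python
    # Verify that near-duplicate variant pages carry a robots noindex tag
    # while the main product page stays indexable.
    import requests
    from bs4 import BeautifulSoup

    PAGES = {
        "http://shop.example/t-shirt/": False,           # canonical page: indexable
        "http://shop.example/t-shirt/red-small/": True,  # variant: should be noindex
        "http://shop.example/t-shirt/blue-large/": True,
    }

    for url, expect_noindex in PAGES.items():
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        robots = soup.find("meta", attrs={"name": "robots"})
        content = (robots.get("content", "") if robots else "").lower()
        is_noindex = "noindex" in content
        status = "OK" if is_noindex == expect_noindex else "MISMATCH"
        print(f"{status}: {url} (robots={content or 'absent'})")
    ```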

    | GeorgeDavis
    1

  • The reason I would like to make the change is to make the URLs easier for visitors to remember, and to make them look more professional. I personally feel that example.com/blah.html just looks kind of sloppy. Thanks for your answers.

    | Aftermath_SEO
    0

  • I have a lot of thoughts on this subject. If I were to write a blog entry on this topic, it would span multiple pages or have to be broken down into sub-topics. I do think there is a correlation between good code and search engine rankings. I do not think there is a correlation between a W3C-validated page and search engine rankings. The validator is not current enough, nor flexible enough, to accommodate the real-world situations which websites encounter.

    Example A: HTML5 is recognized by all major browsers, yet W3C validation of HTML5 is still experimental. A specific example that applies to SEO is the canonical tag. According to the W3C validation site, the canonical tag is not currently valid. Take a snippet of HTML5 code which passes validation, add a canonical link tag to the header, and the code will no longer pass validation. This is a direct conflict between best practices and validation.

    Example B: The world's most popular web page, google.com, does not pass validation. Matt Cutts discussed the topic. In short, they had a choice between providing code which validated, or code which worked. They chose to go with the working code.

    Example C: The standard Facebook widget code, YouTube video code, and other popular code does not pass validation.

    Whenever I design a website, I check the code in the validator to look for errors. Initially, I will find numerous errors related to code outside of my control, such as social sharing widgets or YouTube videos. Once I remove that code, the page often validates. I have researched the issue, and it is possible to modify the Facebook code or YouTube code so that it still functions and passes validation. Doing so requires extra effort, it provides absolutely zero benefit other than saying "hey, I pass validation", and there are often drawbacks such as having to add extra javascript to your site, which can otherwise be viewed as unnecessary code.
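    If you want to run that kind of validation check programmatically rather than through the web form, the W3C's Nu HTML Checker exposes a JSON interface. A small sketch (my illustration; endpoint and parameters as I understand them, so verify against the current W3C documentation before relying on it):

    ```python
    # Ask the W3C Nu HTML Checker to validate a page and count the errors.
    import requests

    def validate(url: str) -> None:
        resp = requests.get(
            "https://validator.w3.org/nu/",
            params={"doc": url, "out": "json"},
            headers={"User-Agent": "validation-sketch/0.1"},  # polite identifier
            timeout=30,
        )
        messages = resp.json().get("messages", [])
        errors = [m for m in messages if m.get("type") == "error"]
        print(f"{url}: {len(errors)} error(s)")
        for m in errors[:5]:                 # show the first few messages
            print("  -", m.get("message"))

    validate("http://www.example.com/")      # hypothetical page to check
    ```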

    | RyanKent
    0