Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Latest Questions

Have an SEO question? Search our Q&A forum for an answer; if not found, use your Moz Pro subscription to ask our incredible community of SEOs for help!


  • Hi Trevor, Math in link building is complicated. I believe you shouldn't get attached to any single idea like "only high PA links" or "as many links as possible". It should be a mix of those ideas. The link building target should always be diversification: make it look natural, even though we know link building isn't always natural. I prefer getting some high-authority links, some low-authority links, and some medium-authority ones, all spaced out in time with some days or weeks in between. "Make it look natural." Never forget to mix anchors: exact URL, dofollow, nofollow, and generic anchors (source, click here, more info). Hope this is useful for you. Bye

    Link Building | | GastonRiera
    1

  • Yes, and I think Web 2.0 sites (as defined by the way you are using the term, Jubaer), including those hosted on Wordpress.com and Tumblr, are a great example of why search engines treat subdomains as distinct domains. Practically anyone can set up a site on Wordpress.com like abc.wordpress.com with minimal effort; that is, after all, the primary service Wordpress.com offers. The sites that exist as subdomains of Wordpress.com have very little in common with each other beyond being hosted there. Wordpress.com has a DA of 96. There are some truly fantastic sites hosted as subdomains of Wordpress.com, and there are also a lot of really crappy ones. If search engines treated the value of a link from crappysite.wordpress.com as authoritatively as the value of a link from fantasticsite.wordpress.com simply because both are hosted on wordpress.com (which does, as you say, have a DA of 96), would that be best for users? Nope, it would not. Someone could have made crappysite.wordpress.com for the sole purpose of "earning" a link to their own (self-hosted) site, or created it with good intentions but never put in the work needed to become a valuable site for its intended audience, like fantasticsite.wordpress.com did. And let's not forget about the importance of relevance: the topics of sites on Wordpress.com are all over the place. Therefore, it makes sense to treat each one separately, right?

    Local Strategy | | Christy-Correll
    1

  • I'm going to go ahead and lock this thread to future comments.

    Local Website Optimization | | MattRoney
    1

  • Run Screaming Frog on your subdomains and check the Images tab in the report, then sort by image size and you'll find the large images. Download Screaming Frog from here: http://www.screamingfrog.co.uk/seo-spider/
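    If Screaming Frog isn't an option, a rough stand-alone sketch of the same check in Python (the 100 KB threshold and the sample inputs are my own illustrative assumptions, not anything Screaming Frog uses):

    ```python
    # Hypothetical sketch: replicate the "Images tab, sorted by size" check
    # without Screaming Frog, using only the Python standard library.
    from html.parser import HTMLParser

    class ImgCollector(HTMLParser):
        """Collect the src attribute of every <img> tag on a page."""
        def __init__(self):
            super().__init__()
            self.srcs = []

        def handle_starttag(self, tag, attrs):
            if tag == "img":
                src = dict(attrs).get("src")
                if src:
                    self.srcs.append(src)

    def largest_images(sizes_by_url, threshold_kb=100):
        """Return (url, kilobytes) pairs over the threshold, largest first.
        sizes_by_url maps an image URL to its size in bytes, e.g. gathered
        from HEAD requests by reading the Content-Length header."""
        over = [(u, b / 1024) for u, b in sizes_by_url.items()
                if b / 1024 > threshold_kb]
        return sorted(over, key=lambda pair: pair[1], reverse=True)
    ```

    You would feed each crawled page's HTML to `ImgCollector`, fetch the sizes of the collected URLs, and then let `largest_images` surface the heavy ones.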

    Intermediate & Advanced SEO | | Gyorgy.B
    0

  • I understand your fears completely, because WP + themes + plugins can be a HUGE mess. Why? Because some devs don't know technical and on-page SEO. I have seen themes that put the text "Comments" inside an H1 tag while the post title was wrapped in an H3. Examples of hidden text, messy HTML, and bloated markup are countless. You can also get server overloading, slow SQL queries, performance issues, PHP issues, and hosting issues. Running WP means you will get 10 CSS files and 10 JS libraries, and that is the "best case scenario"; just imagine the worst. Now add the upgrading procedure, where everything can be broken (or changed!) with just one click. Ah, and my favorite: security exploits and hacking. Sounds like a "perfect storm", doesn't it? Now, I know this sounds scary. That's why you should review the HTML code of the WP site before migration. You will probably see potential for improvements, and those changes need to be patched over the original files (I'm talking about the theme and plugins). For the theme it's OK: you can make a child theme based on the original. But for plugins, you need to "fork" the original plugin and make your own custom version. Then on each update you should "diff" your version against the original to keep your patches while taking in the new code. This means a very strong backup solution, plus a local dev environment and extra work on each update. Also, it's 2016. Why not look around for alternatives? One suggestion: static site generators: https://www.smashingmagazine.com/2015/11/modern-static-website-generators-next-big-thing/ https://www.smashingmagazine.com/2015/11/static-website-generators-jekyll-middleman-roots-hugo-review/ As you can see, I'm not giving a Yes or No answer; I'm just giving you a few extra points to think about and putting the cards on the table. PS: This may sound a little negative because I had some bad experiences with themes and plugins in the past. One wrong choice and all SEO efforts can be ruined. Such is life...
Glossary: "fork" - the process of creating a different version of something existing, with changes that don't exist in the original. You may hear that devs fork projects too. In your case: you could apply your patches directly, but on the next update everything would revert to the original - that's why you need to fork them. "diff" - the process of checking the difference between files/projects and extracting/showing only what differs between them.
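The fork-and-diff step above can be sketched with Python's standard difflib (the file names and PHP snippets here are made-up examples, not real plugin code):

```python
# Illustrative only: after an upstream update, diff your forked plugin
# against the new original to see which custom patches to re-apply.
import difflib

def patch_report(original_lines, forked_lines):
    """Return a unified diff: '-' lines are upstream code you changed,
    '+' lines are your custom patches that must survive the update."""
    return list(difflib.unified_diff(
        original_lines, forked_lines,
        fromfile="plugin/original.php", tofile="plugin/forked.php",
        lineterm=""))
```

Running it on the old and new versions of each forked file gives you a checklist of your patches before you click "update".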

    Web Design | | Mobilio
    0

  • Hi Nikos, It's important to remember that Keyword Difficulty scores are a Moz metric, not a Google metric. They are based on Moz's ability to judge how well other sites are competing for that term, and may not capture the entire competitive landscape (since nobody except Google knows everything that Google looks at). Based on your ability to rank well for some terms and not others, it doesn't seem likely to me that you are under any sort of penalty, so much as that Google just isn't ranking you for some terms. In addition to the Keyword Difficulty scores for each term, take a look at which sites rank for the term (you can do this in the SERP Analysis feature of the Keyword Difficulty tool). Ask yourself: What kinds of sites rank for this term? For example, if you are an individual business, but all of the sites and pages ranking for that term are aggregators or lists of multiple sites, it may be that Google has determined that an individual business site is not a good fit for that query. Similarly, if your page is a blog post and no other blog posts appear in the SERP, Google may have decided that a blog post isn't what people are looking for when they search that term. What is the search intent of the query? Based on the other pages that rank, what is the question or task that Google has decided users are trying to answer or complete when they search this term? Does your page do a better job of helping answer that question or complete that task than the other pages that rank? What types of content are ranking? Do they all have rich snippets? Are there images, video, shopping or maps results? All of these will tell you more about the kind of content Google thinks will match this query. Is there a specific page or website ranking for that term that you think you could push out of the top 10? Look for areas of opportunity.
For example, maybe there is a site with high authority, but the page that ranks has very low page authority and doesn't fit the query very well. Try to create a page that is better than that page, specifically. How closely is the phrase related to your niche? You can tell from the keywords you are successfully ranking for which topic areas Google is associating with your site. If you have a whole site about chocolates, it will be harder to rank a page about asparagus, even if the difficulty score is lower. Also, don't forget to continue promoting your content to earn high-authority links to individual content pieces. Where it makes sense to do so, you may also want to link internally from some of your more popular and successful pages to some of the pages that are struggling. I hope that helps!

    Intermediate & Advanced SEO | | RuthBurrReedy
    0

  • The little box you are talking about: make a small table or div and float it to the left. When you say duplicate menu at the bottom, do you mean the main menu or the sidebar menu? You can do either. Anything you want.

    Web Design | | EGOL
    0

  • Hi There! Thanks for your response!! I emailed our IT group about the sitemap observation. The site went down on January 4th, but only for 30 minutes. We noticed the odd SERP results on Wednesday, January 6th. According to Moz, our keyword rankings for "duluth trading co" & "duluth trading company" have been all over the place. See here: http://screencast.com/t/Tgc3qZgYrd & http://screencast.com/t/uvgHeRGNy I also noticed that "https" is showing up and not "http". I'm not sure why that is happening, either...

    Branding / Brand Awareness | | sderuyter
    0

  • "It doesn't have any backlinks, but I'm stumped as to why it isn't ranking higher than it is. I'm not expecting first position, but outside of the top 50 is something else." No backlinks and a recent domain registration date are big answers to your question. This isn't an especially difficult SERP, but a person can't walk right in and expect to displace sites that were on the web and working to gain visibility ten, even twenty, years before your first upload. That's the situation when you arrive late to the battle. Just as a comparison: if I upload an article on a twenty-year-old domain with a DA of about 78, targeting a keyword of similar difficulty, that article might not rank in the top 100 for months, and might not rise to the first page for a year or more. The people on the first page for your keyword are making money and will fight to hold it.

    On-Page / Site Optimization | | EGOL
    0

  • Hi Kaitlin! The radius from which Google draws local and localized organic results is really dependent on competition. There won't be a single answer to your question, because it's going to be different in each case. For example, if you are located in a very rural area with few options, Google will reach out beyond the borders of your town to adjacent towns to make up a full set of results. In some cases like this, there won't even be a 3-pack, but solely organic results. When you are dealing with a large city, you are much less likely to see this outreaching behavior on Google's part, because they will have plenty of results right near the user within the city. The only exception to this would be if the business is offering something very unusual and there are few or no competitors inside the city. For a Service Area Business, the rule of thumb is to go for local pack rankings for their city of location and organic rankings for their service cities. It's rare for a service area business to rank in the local pack for any city where they lack a physical location, unless, again, they are offering something very rare. Doing research on Maps will help you determine the general radius from which Google is drawing results for a particular query, but it's extremely important to remember the user-as-centroid phenomenon, especially when dealing with cities. Google will show different results to users at one end of the city than to those at the other end of it. Educating clients about the fact that there are no static rankings is vital these days. Hope this helps!

    Local Listings | | MiriamEllis
    0

  • Hi Alex, This would depend on how you create your segments. If you go with session-based, it will only show users/sessions that have completed multiple goals within the given time frame. If you make your segmentation user-based, it will show you users who have completed multiple goals (even if it was outside of the specified time frame) and who have had a session during the specified time frame. Try looking at the same date range with both a user-based and a session-based segmentation for multiple goal completions. You should notice that when you adjust your segmentation from user-based to session-based, the number of sessions/goal completions changes. Keep in mind that unless you are using a unique user-ID-based segmentation, these views will be based on the default GA definition of a 'user', which takes into consideration the unique browser/device and can potentially count the same user twice. Hope this helps, let us know how it goes!

    Behavior & Demographics | | troy.evans
    0

  • Hi James! I definitely understand what you're saying. Please know that getting every question answered to the best of our ability is a top priority for us. When we, as admins, see that one hasn't been answered, we utilize an entire staff of expert Associates to whom we assign questions. In fact, an Associate had been assigned to your question shortly after it was initially posted; he just hadn't gotten to it yet. Because our Associates and our community members are located all over the world and have their own careers to consider, it often takes a day or two, and occasionally more than that, to get an answer. This is especially the case if it's a particularly complex or specific question. It looks as if you removed the text of this question the same day it was posted, which, frankly, is nowhere near enough time to determine that the question won't be answered. Let me know if you have any questions about Q&A or our admin process.

    Getting Started | | MattRoney
    0

  • Hi Andrew, This is very helpful, thank you! I have already put together a case for improving our page speed, so it's something I'll push harder with the developers. I am also working on a section of the site which will include user guides and helpful articles, so this is great. Thank you!

    Intermediate & Advanced SEO | | BeckyKey
    0

  • Yes - there is a bug in your robots.txt. You should write something like: Disallow: /?display=table or: Disallow: /?display=*
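    To sanity-check a rule like that before deploying it, one option (a sketch, assuming just the single Disallow line shown; example.com is a placeholder domain) is Python's built-in robots.txt parser:

    ```python
    # Verify that "Disallow: /?display=table" blocks the faceted URL
    # while leaving ordinary pages crawlable, using only the stdlib.
    import urllib.robotparser

    rules = """\
    User-agent: *
    Disallow: /?display=table
    """

    parser = urllib.robotparser.RobotFileParser()
    parser.parse(rules.splitlines())

    # The parameter URL should be blocked; a plain path should not be.
    table_blocked = not parser.can_fetch("*", "https://example.com/?display=table")
    page_allowed = parser.can_fetch("*", "https://example.com/category")
    ```

    Note that urllib.robotparser follows the original robots.txt standard, so a wildcard rule like `Disallow: /?display=*` would need to be checked against Google's own robots.txt tester instead.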

    Intermediate & Advanced SEO | | Mobilio
    0

  • Great, thanks everyone!

    Search Engine Trends | | BeckyKey
    0