Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • You're very welcome! Do feel free to reach out if you need any more help.

    | andy.bigbangthemes
    0

  • Hi. I think mememax gave a very good answer. The only thing I would submit for consideration is that making too many changes at one time can be hard to track later.

    When we did the switch to https, I was super paranoid we would screw something up and lose rankings, so I chose to leave the disavow file exactly the same. It turned out the switch was not as bad as I thought, and we didn't see any noticeable effect on rankings. Later, once I was convinced that the https switch was not a factor, I could modify the disavow file. I also left the old domains from years ago in there for the reasons mememax points out. Good luck!

    | Chris661
    0

  • Agree with Kevin's question, but 99% of the time you want to use a 301. A 302 causes Google to index the new page without dropping the old one, which means Google now sees two pages with identical content. While this is not an issue per se, it can dilute a lot of your SEO value, as it's split across two pages. Moreover, Google may sometimes decide to serve the old page rather than the new one, since it may have backlinks pointing to it. I strongly suggest you update your redirect rule to 301. I don't know Magento specifically, but there are many discussions out there. Here is code you may include in your .htaccess file. NOTE: do not touch the .htaccess file if you don't know what you're doing, or you can break the whole site.

    <code>
    RewriteEngine On
    RewriteCond %{HTTP_HOST} !^www\. [NC]
    RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]
    </code>

    You may want to change the rewrite rule to https://www. if your site is secure.

    | mememax
    0
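
If the 302 comes from a single Redirect directive rather than a mod_rewrite rule, the fix can be as simple as changing the status code. A minimal sketch (both paths are made-up examples, not taken from the site in question):

```apache
# Permanent (301) redirect for one moved URL
# A temporary redirect would be "Redirect 302" (or "Redirect temp") instead
Redirect 301 /old-page.html /new-page.html
```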

  • IMO, internal link structure could help or be a problem, but it's hard to say. Maybe these pages are ranking exactly where they should rank. It may not be necessary to have these categories, or they cannibalize each other; hard to say without more info.

    | paints-n-design
    0

  • Moz did claim in a few Whiteboard Friday videos that they were able to prove Google does take grammar/spelling into account.

    | andy.bigbangthemes
    1

  • Yes, but the article is about a year old, so it may not be as accurate now.

    | PenaltyHammer
    0

  • Hi Kenneth, I think it depends on whether you truly operate as a local business within that city location. If you intend to advertise to a specific city, then the intent changes from finding you on a national level to finding you at a city-specific level.

    If you truly operate (and you haven't said) from that city location, then you could really optimise the page as city-specific, so it would rank highly in that local area. You could make the page different from the national page by including photos of the city with appropriate alts and a little about the city itself. You'd find it relatively easy to rank at a local level for the page.

    If you do not operate at city level (with a local office) and are a national company simply targeting a specific city to sell to, then I would canonicalize the page back to the generic one. It begs the question, though, why you would want a city-focused page in the first place and why the national one wouldn't suffice.

    I hope that clears (and not muddies!) your thinking! Regards, Nigel

    | Nigel_Carr
    0
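
For the canonicalization option Nigel mentions, the city page would carry a canonical tag pointing at the national page. A minimal sketch (both URLs are invented placeholders):

```html
<!-- In the <head> of the city-specific page, pointing search engines at the national page -->
<link rel="canonical" href="https://www.example.com/national-service-page/">
```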

  • Having "wicker" in the domain is only one small ranking factor of 200+. Although they used to rank higher, a drop in rankings could be due to a number of different factors, especially if you are talking a few years ago or more. There have been many Google updates that could have had an impact on your client's site.

    The first thing I would do is a competitor analysis on the top 3 ranking businesses that are showing up in the SERPs you want to show up in. Some of the following questions and comparisons would be a good starting point:

      - How many domains do they have linking to them?
      - What are their top 5-10 backlinks?
      - How much unique content do they have on the ranking page? Is it well-written and structured, with pics and/or video?
      - What is their meta title (what you click on in the SERP)? How close is the keyword to the beginning of their title?
      - If attempting to rank locally, what is their Moz Local %? (https://moz.com/local/search)
      - How fast does their website load? (I like using Pingdom and PageSpeed Insights for this.)

    Compare those answers with what you see for your client's website and I am sure you will see some areas for improvement, but it will also give you very good goals as far as content creation, link opportunities, technical SEO edits, and more.

    | LureCreative
    1

  • Why couldn't I just put a password on the staging site, and let Google sort out the rest? Just playing devil's advocate.

    | EugeneSong
    0
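
One common way to do exactly that on Apache is HTTP Basic Auth via .htaccess. A minimal sketch (the htpasswd file path is an example; note that Google still recommends noindexing or removing any staging URLs it has already crawled):

```apache
# Password-protect the whole staging site
AuthType Basic
AuthName "Staging - authorized users only"
# File created beforehand with the htpasswd utility
AuthUserFile /var/www/staging/.htpasswd
Require valid-user
```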

  • Hi There! The best practice here would be to implement those 301 redirects to prevent the possibility of duplicate content. It does not appear that any https://www. pages are in Google's index now (screenshot), but the potential is there, with both http://www. and https://www. URLs able to resolve and both containing self-referencing canonical tags.

    I would recommend making the switch to https:// given Google's recent announcement regarding the 'Not Secure' warning in Chrome 62 starting October (all http:// pages with forms will be marked as 'not secure' - more on that here). This warning will be triggered on every page due to the 'request a quote' form in the right-hand sidebar.

    If you are interested in looking into this further, I recommend checking out this 'Secure your site with HTTPS' Search Console Help article. Hope this helps!

    | Joe_Stoffel
    0
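
The http-to-https 301 described above can be sketched in .htaccess like this (assuming Apache with mod_rewrite enabled; test on a staging copy first):

```apache
RewriteEngine On
# Any request arriving over plain http is permanently redirected to its https equivalent
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```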

  • The answer to this question depends upon:

      - the strength of your site
      - the competitiveness of the niche
      - who is competing in the niche
      - how much work you plan to do to promote the page
      - your skill level
      - and many other factors

    One of the biggest problems in online marketing is faced by the client who asks an SEO to promote his website. SEOs often give them a monthly price to work on the site - such as $1000/month - and have no idea how long it will take to get a return for the client. SEOs can quote a severely low number and work happily for a long time without producing results. Why? Because the SEO is doing $1000 worth of work per month but the competitors are doing $5000, and the client sees his rankings fall after hiring the SEO. Furthermore, the SEO might be charging $1000 and not know what he doesn't know, and that $1000 has negative value for the client - for example, those hit by Penguin.

    It takes a lot of experience, knowledge and study to know how much force must be applied to move the needle. Sometimes you can know what to do and how to do it, but you don't have the ability to pull it off.

    "One may know how to conquer without being able to do it." - Sun Tzu

    "If you know the enemy and know yourself, you need not fear the result of a hundred battles. If you know yourself but not the enemy, for every victory gained you will also suffer a defeat. If you know neither the enemy nor yourself, you will succumb in every battle." - Sun Tzu

    http://classics.mit.edu/Tzu/artwar.html

    | EGOL
    0

  • Well yeah, sure... but why not fix it in the first place? Too many redirects are not a good idea.

    | andy.bigbangthemes
    0

  • Mmm, it depends. It's really hard for me to answer without knowing your site, but I would say that you're headed in the right direction. You want to give Google more ways to reach your quality content.

    Now, do you have any other page that is bringing bots there via normal user navigation, or is it all search-driven? While Google can crawl pages discovered via internal/external links, it can't reproduce searches by typing in your nav bar, so I doubt those pages are extremely valuable unless you link to them somehow. In that case you may want to keep Google crawling them.

    A different question is whether you want to "index" them: being searches, they probably aggregate information already present elsewhere on the site. For indexation purposes you may want to keep them out of the index while still allowing the bot to run through them.

    Again, beware of the crawl budget: you don't want Google wandering around millions of search results instead of your money pages, unless you're able to let it crawl only a sub-portion of them. I hope this made sense.

    | mememax
    0
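
The crawl-but-don't-index behaviour described above is typically requested with a robots meta tag on each search-results page. A sketch (whether it fits depends on how those pages are generated):

```html
<!-- Bots may follow links on this page, but the page itself stays out of the index -->
<meta name="robots" content="noindex, follow">
```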