Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Personally I think it's madness to "nofollow" any internal links. When you "nofollow" a link you are throwing link juice out the window. The days of link sculpting (the practice of "nofollowing" some links on a page so more juice flows through the other "follow" links) are long gone, yet I still see it being attempted all over the place.
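
    For anyone unfamiliar, the difference is literally one attribute in the markup (the URL here is just a placeholder):

      <!-- nofollowed internal link: the equity is thrown away -->
      <a href="/pricing/" rel="nofollow">Pricing</a>

      <!-- normal internal link: the equity flows through -->
      <a href="/pricing/">Pricing</a>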

    | PaddyDisplays
    0

  • Thanks for the responses, guys. Sounds like you both think using location pages is still a good way to go, as long as you are not trying to fake having a location where you don't actually have one.

    | netviper
    0

  • Using the noindex,follow combination is a form of advanced page sculpting, which is not truly an SEO best practice. Here's why: If you deem a page not worthy of being in the Google index, attempting to say "it's not worthy of indexing, but the links on it are worthy" is a mixed message. Links to those other pages should already exist from pages you do want indexed. By doing noindex,follow, you increase the internal link counts in artificial ways.
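
    For reference, this is the tag combination in question, placed in the <head> of the page you don't want indexed:

      <meta name="robots" content="noindex,follow">

    ("follow" is the default anyway, so <meta name="robots" content="noindex"> means exactly the same thing.)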

    | AlanBleiweiss
    0

  • Hey David, thanks for the reply. On point 3 ("Use a plugin to apply rich snippet markup to the individual product pages, adding another layer of 'uniqueness'"): I had thought about this already and was looking into the MPN (Manufacturer Part Number) attribute for products (https://schema.org/mpn). However, it's not clear if, like SKU, the MPN needs to be unique to the ProductModel (https://schema.org/ProductModel)? If that were the case, I'd have a problem, as there are multiple MPNs per ProductModel. I see https://schema.org/isVariantOf too, which could be useful? Anyone with experience of Schema?
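
    To make it concrete, this is roughly the shape of markup I have in mind for a single variant (the product name, SKU and MPN below are made up):

      <script type="application/ld+json">
      {
        "@context": "https://schema.org",
        "@type": "ProductModel",
        "name": "Acme Widget (Blue, Large)",
        "sku": "AW-100-BL-L",
        "mpn": "AW100BLL",
        "isVariantOf": {
          "@type": "ProductModel",
          "name": "Acme Widget"
        }
      }
      </script>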

    | seowoody
    1

  • Hi, In my opinion, Google definitely does take certain reviews into account when ranking sites. What I mean by that is reviews that can actually be measured numerically. These are the two types of reviews that I believe affect rankings: Google+ reviews, and reviews that use schema to mark up the rating. Both of the above can be easily measured by using the number of stars, points etc. (which all represent a percentage out of 100%). By using a very simple algorithm, you could collect this information from across the internet and calculate how well, on average, a particular website (this can also be done for products, recipes etc.) rates with the public. We have also had our fair share of negative reviews (it has to be expected) and have actively sought to increase our positive ones (honestly and naturally) to combat this. For us, it seems to have worked successfully. What's your opinion on this?
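
    For illustration, the second kind of markup I mean looks something like this (the business name and figures are invented):

      <script type="application/ld+json">
      {
        "@context": "https://schema.org",
        "@type": "LocalBusiness",
        "name": "Example Serviced Apartments",
        "aggregateRating": {
          "@type": "AggregateRating",
          "ratingValue": "4.6",
          "bestRating": "5",
          "ratingCount": "128"
        }
      }
      </script>

    Anything expressed that way is trivial to normalise to a percentage and compare across sites.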

    | SilverDoor
    0

  • It's hard to say without knowing exactly what's been changed on the site/inbound links, but I wouldn't be surprised if it had something to do with the backlinks... this graph does not look normal and it appears a lot of backlinks have disappeared in July - https://ahrefs.com/site-explorer/overview/subdomains/?target=www.weddingphotojournalist.co.uk Edit - make sure to select the "One Year" tab for the graph

    | StreamlineMetrics
    0

  • Build some links! You have a grand total of zero links pointing to this domain from zero sources, with a domain authority of 1 (the minimum, basically zero). I'm not sure why you're worried about your home page showing up when your domain/company is googled right now. I think you have bigger fish to fry. Those types of things will work themselves out as your SEO campaign takes off (ORGANICALLY!). Googling "docslinc" at least brings your domain to the first results. It's an FAQ page, sure, but it's you.

    Google is in fact indexing your entire site, but that may not last long either, as you have some pretty major duplicate content issues approaching. Go to Google and search this query: site:docslinc.com. You'll see something like 1700 pages indexed, and the VAST majority of them are duplicates of sign up and login pages. Examples:

    https://www.docslinc.com/login.php?dctid=Mjg=&prid=NTI=&tim=MTQ6MDA=&dt=MjAxNC0wNy0yMg==&bkap=inst&brkt=MzA=&apptid=&ins_plan=&spl=1
    https://www.docslinc.com/login.php?dctid=Mzg=&prid=NTY=&tim=MDk6MzA=&dt=MjAxNC0wNy0xMQ==&bkap=inst&brkt=MzA=&apptid=&ins_plan=&spl=1

    (Yes folks, those URLs are only slightly different.) They are all being indexed separately, and that's a Panda penalty waiting to happen. You need to get rel canonical tags on those pages and point them all to one version, or block crawling of them altogether, ASAP. That dupe content issue will not solve your original question, but it is very important nonetheless.

    To be honest, I don't think your original concern matters all that much. Just get to building your marketing campaign and the rest will follow. Good luck!
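
    For example, something along these lines on each of those parameterised login URLs (assuming the bare login.php is the version you want to keep):

      <!-- point every parameterised copy at one canonical URL -->
      <link rel="canonical" href="https://www.docslinc.com/login.php">

      <!-- or simply keep them out of the index entirely -->
      <meta name="robots" content="noindex">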

    | jesse-landry
    0

  • Always link by page and subject matter. If you have an article about red widgets, you should link back to the page you have about red widgets. True, you can build up your home page with a lot of links, but this hurts your user experience by making visitors go through the entire site to find what they came there for.

    When you send all your links to the homepage, you are telling search engines which page is the most important. You can often see this on sites where no subpages ever rank or show up in search results, only the home page over and over. Sounds great, doesn't it? But you can limit the exposure of your site in the long run. By directing links at your subpages, not only do you increase the chance that they will eventually rank higher, you can also get more specific with your linking text. Direct keyword linking to the home page is more risky. With your subpage links, you can get closer to, or even use, exact keyword phrases, because the content is more specific. In reality, use whatever link phrase a user will actually click on; a bunch of keywords might not do that job as well as a conversion statement.

    If you are worried about getting people to your home page, change the way the subpages are set up on your site, so that users can go to the other relevant areas of your site with ease. It should be easy and simple to direct them where you want them to go using graphics or styled text.
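
    For example (the URLs and anchor text here are invented):

      <!-- deep link with descriptive, clickable anchor text -->
      <a href="https://www.example.com/widgets/red-widgets/">See our full range of red widgets</a>

      <!-- versus pointing yet another bare-keyword link at the homepage -->
      <a href="https://www.example.com/">red widgets</a>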

    | David-Kley
    0

  • I don't understand how we'd lose traffic... some visitors would see the old site and some would see the new site until the change fully propagated, right? The problem with changing DNS is an initial traffic drop as routers/hubs get the update.

    Quote (REF: http://www.mattcutts.com/blog/moving-to-a-new-web-host/):

    Step 3: Change DNS to point to your new web host. This is the actual crux of the matter. First, some DNS background. When Googlebot(s) or anyone else tries to reach or crawl your site, they look up the IP address, so mattcutts.com would map to an IP like 63.111.26.154. Googlebot tries to do reasonable things like re-check the IP address every 500 fetches or so, or re-check if more than N hours have passed. Regular people who use DNS in their browser are affected by a setting called TTL, or Time To Live. TTL is measured in seconds and it says “this IP address that you fetched will be safe for this many seconds; you can cache this IP address and not bother to look it up again for that many seconds.” After all, if you looked up the IP address for each site with every single webpage, image, JavaScript, or style sheet that you loaded, your browser would trundle along like a very slow turtle.

    If you read this page you'll see Matt Cutts tested mattcutts.com himself and did not see any major impact. However, Matt Cutts has a high profile domain, since he is well known for talking about his experience within Google.

    The point is the test environment works perfectly right now. If the files are migrated over to the live environment, then we could have issues. But if we simply switch the DNS to the test environment, we know that it will work fine. I would concede this point if the major updates are operating in a different test environment than the live environment. By environment I mean different server architecture, like different PHP/ASP versions or database types/versions that the current live server cannot or will not be updated to. When you create a test environment you generally want to duplicate the live environment, so you can simply push the test elements live once complete. If the server architecture is part of the test, then I can't argue with the logic.

    | donford
    0

  • Yea, I cleaned all that stuff up! This website was a mess. Anyway, back to the .htaccess, I have no idea what they are being used for.... but we are actually in luck! I was doing a little research and I came across something interesting.... Not only is that the original file that our old webmaster pulled from, but that's the updated version with 2 sets of entries commented out. Seems like a good place to start? Thanks! Here is the link, because it seems to not want to display properly. http://wordpress.org/support/topic/wp-super-cache-force-to-https

    | HashtagHustler
    0

  • Sure. It is www.wijss.com.

    | wellnesswooz
    0

  • Hi Sika, Your redirects from the .net to the .com site are 302 (temporary) rather than 301 (permanent), which is likely the reason your .net URLs are staying in the SERPs. If you get those switched over to 301, then with a little patience all your .net URLs should be replaced by the correct .com ones. You can check what kind of redirects you have with various online tools, like this one: http://www.redirect-checker.org/

    | LynnPatchett
    0

  • Thank you EGOL, it all makes perfect sense and I appreciate your reply. I suspect the problems are mostly centered on the hosting issues, with secondary potential robots.txt issues as well.

    | labelPR
    0

  • One thing you might be facing is Google tailoring the search results to what it thinks you want to see. I would use Moz's rank checker tool to verify the results first, then act from there.

    | LesleyPaone
    0

  • Lots of interesting ideas. Thank you, everyone.

    | ravashjalil
    0

  • You should have migrated the sold domain's "authority" to the new site using Google Webmaster Tools. Doing this would have transferred all of your SEO value to the new site. However, since this was not done (nor was each link 301 redirected to the new site), I'm sorry to say that you'll need to start over.

    | responsivelabs
    0

  • You're going to want to use the .htaccess file to redirect www to non-www (or vice versa) and then also redirect all http to https. The rules need to be ordered so the www rewrite still fires for https requests, and both should be permanent (301) redirects. Try this:

      RewriteEngine On

      # redirect non-www to www (sending it straight to https)
      RewriteCond %{HTTP_HOST} !^www\. [NC]
      RewriteRule ^(.*)$ https://www.%{HTTP_HOST}/$1 [R=301,L]

      # redirect any remaining non-https requests to https
      RewriteCond %{HTTPS} !=on
      RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]

    | MattAntonino
    0