Category: Intermediate & Advanced SEO
Looking to level up your SEO techniques? Chat through more advanced approaches.
Should you use a canonical tag on translated content in a multi-language country?
Hi Aleyda, Thanks for your answer and thanks for the links. As written in the description, everything will be translated (so also the title, description, comments, etc.). So we don't have to worry about anything: "everything is gonna be alright" (Bob Marley) :-). In addition, the hreflang annotations are a good way to communicate to Google about what is what. Thanks! Best regards, Wesley
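For reference, hreflang annotations of the kind mentioned here sit in the head of each translated page and look roughly like this (the URLs below are hypothetical placeholders):

```html
<link rel="alternate" hreflang="en" href="http://www.example.com/en/page/" />
<link rel="alternate" hreflang="nl" href="http://www.example.com/nl/page/" />
<link rel="alternate" hreflang="x-default" href="http://www.example.com/" />
```

Each language version should carry the full set of annotations, including a reference to itself, so the versions all confirm each other.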
| Zanox
How come Google indexes 2,500 pages one day, only 150 the next, then 2,000 again, etc.?
Hi Dana, Thanks for your detailed explanation. Appreciate it. Of course I understand that site speed is a factor for crawling (+ ranking) and that the Google bots only want to spend a certain period of time on a website. It's more like: when the servers are performing almost equally every day, so page loads are equal too, what could it be? I agree with your two points to consider, but I'm the type of guy that always wants to know why something is happening. @Nakul: Thanks for your response! The pages that are in and out of the index are mostly product pages. So the thing about "frequent updates" could be something. The website is pretty young, so authority hasn't yet been built as it should be for a big site. This can also be a factor, because the more authority, the more time Google will spend indexing a website, right? Anyway, great thanks for both of your answers! Gr. Wesley
| Zanox
Why are these m. results showing as blocked?
Yeah, I was testing exactly the same thing when you posted the response. I even tried crawling as googlebot-mobile and I still get the 301 redirect. Which, from everything I'm seeing, is correct, as no matter what user agent I use (desktop, mobile, spider) I always get a 301 to the www. version. @michelleh, are you sure there's a mobile version not being redirected to the www. one?
| FedeEinhorn
Creating 20+ websites with links back to central site
I would agree also. Creating sub-folders is the best way to go. I would add just a few quick points:
1. Include schema.org GeoCoordinates markup on your pages: http://schema.org/GeoCoordinates
2. Sign up for Yelp if it is applicable for your client's business locations.
3. Make sure the meta description for each page mentions a city/town name.
4. A pretty simple one, but more related to the site overall: set geographic targeting in Google Webmaster Tools.
5. Sign up for Google+ Local (mentioned above), Bing Places and Yahoo Local.
6. If your business is suited to Yelp, you're most likely able to handle other review sites too. Submit the business to review sites, which will help generate more exposure for the brand. Good reviews can also be used to increase CTR% in Google.
Good luck!
| Goulart
Local and Organic Listings
You are very welcome, HippieChick. Glad this helped to clear up a big question at your office!
| MiriamEllis
Duplicate Title - Magento Products / Kunena Forum - Nofollow vs. Follow
Nicholas, we were just discussing this question over here--it's worth taking a look at: http://seomoz.org/community/q/forum-website-rel-nofollow-is-this-good
| Chris.Menke
Renaming your domain from an existing live domain and SEO implications - Please Help *shudder*
Hello, You may also want to try rel="canonical" (https://support.google.com/webmasters/answer/139394?hl=en) or look into dedicated redirection services (http://en.wikipedia.org/wiki/URL_redirection#URL_redirection_services). Also, if you would like to keep your PageRank, domain authority, and URL, I would suggest that you simply recreate (by copy and paste) the new website's code, structure, and content on the old domain. That seems to me the best solution, at least as long as it's not too hard to recreate the website. Hope it helped, Eugenio
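For reference, the rel="canonical" element suggested above is placed in the head of the duplicate page and points at the preferred URL (the URL below is a hypothetical placeholder):

```html
<link rel="canonical" href="http://www.example.com/preferred-page/" />
```

Note that, unlike a 301 redirect, this is only a hint to search engines; visitors still see the page they requested.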
| socialengaged
Redirecting a working dynamic URI to a new static format
This is what I would do in .htaccess:

Options +FollowSymLinks -Multiviews -Indexes
RewriteEngine On
# This line is for the categories
RewriteRule ^category/(.*)$ index.php?l=product_list&category_name=$1 [L]

Note: you will need a new rule for each "type" of page; that one is an example for the categories only. Then you will need to add some PHP code to read the $_GET["category_name"] variable, which should be text (like banner-stands), and get the category id from your database to later load the products. You have several ways to do that. For example: insert an extra column in the DB with the category name "dashed" and search for the category id using that new field. You should then change your internal linking to use those new URLs instead, and add a rel=canonical pointing to the new versions of the URLs. Don't 301 all the others until you have the above solved and working. Make sure you make everything safe, as with that structure something like index.php?l=product_list&c=1&sort=asc won't work anymore unless you create extra rules for it; the same goes for pagination, if you have any. Hope this gives you an idea of where to start. ***You are not redirecting dynamic to static URLs, you are just making them friendlier.
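The "dashed" category-name lookup described above can be sketched roughly like this (a Python stand-in for the PHP logic; the table, column names, and sample data are all hypothetical):

```python
import sqlite3

def slugify(name):
    """Turn a category name like "Banner Stands" into the "dashed" form "banner-stands"."""
    return name.strip().lower().replace(" ", "-")

# Hypothetical schema: a categories table with the extra "slug" column suggested above.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE categories (id INTEGER PRIMARY KEY, name TEXT, slug TEXT)")
for cid, name in [(1, "Banner Stands"), (2, "Trade Show Displays")]:
    conn.execute("INSERT INTO categories VALUES (?, ?, ?)", (cid, name, slugify(name)))

def category_id_from_slug(conn, slug):
    """Look up the category id from the friendly URL segment (the rewritten category_name)."""
    row = conn.execute("SELECT id FROM categories WHERE slug = ?", (slug,)).fetchone()
    return row[0] if row else None

print(category_id_from_slug(conn, "banner-stands"))  # -> 1
```

With the id in hand, the product-list query proceeds exactly as it did with the old ?c=1 parameter.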
| FedeEinhorn
Page is noindex
As we have 10K+ pages indexed already, we set them as noindex and expect the search engines to drop those pages, after which we can block them via robots.txt. If I block them first, the indexed pages will not be dropped, right? Also, a URL removal request for 10K pages, one by one, is going to be tedious.
| mtthompsons
How to know when to use singular vs. plural in anchor text and on-page copy?
A lot of results for singular/plural and synonyms are so similar as to be nearly identical for the first page or two, which is what really matters, and which is what Gregory Baka is referring to. You will notice that a lot of the time, if you search for something, you'll see synonyms and variants bolded in the title and description in the SERPs. That would be your signal that one is being treated as synonymous with (though not "identical to") the other.
In terms of singular vs. plural, I tend to include both variations naturally within descriptions and on-page copy. External links tend to contain both versions too, unless you're buying the anchor text. I would think, based only on common sense and experience and not any quantifiable study, that Google looks for natural variation. If you have two different landing pages, one targeting the singular and the other targeting the plural, that would not only be wasting effort, money, link equity, etc., but it would also seem very unnatural. If I were writing an algorithm, I'd probably figure out a way to push such pages lower in the results unless other signals pointed to really high quality at the page and/or domain level.
ALL of this "common sense" stuff flies out the window, though, when any ambiguity of intent or results is involved. For example, with "cars" you could be talking about the animated movie, which is why you see IMDB, Disney and Wikipedia in the results. This disambiguation factor is why Google is pushing for semantic markup of the web, and is probably why topic modeling has become increasingly important (e.g. if you want to rank better for "cars" when the user intent is to find the animated movie, use words like "Pixar" and "Lightning McQueen" in the copy).
As a rule of thumb, I tend to go with whatever sounds better and makes more sense to the user. For example, on a category page I might write "blue widgets" in the title, but I'd use "blue widget" on a single product page. From there I go with what the data says. Looking at Analytics a few months later, I pay attention to traffic and keywords as a follow-up. If the "blue widgets" category page gets 80% of its traffic from a #3 ranking for "blue widget" while it ranks #1 for "blue widgets", that tells me I should probably change the title to the singular version. In the end, I usually find I get the best results when I don't think too hard about it and just go with my gut when writing. I know that's not scientific or anything, but if it works, it works.
| Everett
Getting a Home Page to Rank
Thanks Vadim, that does help. I'm thinking the accordion may not be the best for my site, but I will look into other hide-until-clicked-to-reveal snippets.
| Travis-W
Can we retrieve all 404 pages of my site?
The 404s in Webmaster Tools relate to crawl errors, so they will only appear if the missing pages are internally linked. The report is also limited to the top 1,000 pages with errors.
| matbennett
Keywords losing positions - what's the best that can be done?
Ah, thank you for the clarification. In that case, here are some things you can do:
1. Review the on-page SEO of the articles and see if they are fully optimized for the keywords. If not, optimize the pages, but try not to fall into the trap of over-optimizing.
2. Update the articles and republish them. Freshness can affect rankings.
3. Create a new version of an article and publish that. I'm not talking about spinning, but rather, if it's been a while since the article was written and perhaps there are some new things that could be added or old things that should be removed, you could create a new version. In fact, in some situations, you may have articles that can be redone every year.
4. Try to build some links to the articles.
5. Share the articles on social networks and with bloggers in your niche who might find them interesting and useful.
Kurt Steinbrueck, OurChurch.Com
| Kurt_Steinbrueck
In Webmaster Tools or Analytics, can we find pages that 404?
Weirdly enough, I've just been answering the same point in another question: http://moz.com/community/q/can-we-retrieve-all-404-pages-of-my-site The link above has a few more options, but this bit is the most directly relevant to your question: Analytics: as long as your error pages trigger the Google Analytics tracking code, you can get the data from here as well. This is most helpful when the error page either sets a custom variable or uses a virtual URL (404/requestedurl.html, for instance). Isolate those pages and look at where the traffic came from.
| matbennett
How to add subdomains to webmaster tools?
Hello Samuel, I assume you mean http://blog.mysite.com instead of www.blog.mysite.com. You will need to verify that subdomain in Google webmaster tools and upload a separate sitemap just as if it were a different site. Yes you can have more than one sitemap on the same domain. However, a subdomain is considered a separate domain.
| Everett
Login required pages that redirect back to the post
Nofollow the links to all login pages and add a noindex meta tag to all login pages. Just keep those login pages out of the index.
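As a concrete illustration of the two tags mentioned (the login URL below is hypothetical):

```html
<!-- On pages that link to the login page: -->
<a href="/login?return=/some-post" rel="nofollow">Log in</a>

<!-- In the head of the login page itself: -->
<meta name="robots" content="noindex">
```

The nofollow keeps link equity from flowing to the login URL; the noindex keeps the page itself out of the index even if it gets crawled anyway.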
| CleverPhD
How does Google recognize original content?
Some believe that the code of your website is taken into consideration by Google. This basically implies that duplicate content would only apply to the creation of multiple blogs all coded the same way with the same text, a tactic used by many people running automated software. But this is just a rumor. From personal experience, movie news blogs and websites tend to churn out identical news stories, including pictures, video and text, and I have not seen any of these sites being held back in their rankings.
| FlashBangSEO
Manual reconsideration request not going away.
Hi TeamSEO, Sorry to hear about your difficulties. Without doing a full site audit and backlink audit, it would be challenging to determine exactly why you've fallen out of good graces, but a couple of things jump out at me from your description of the problem.
1. Backlinks: You received a message about unnatural links, which is a good indication of a penalty. In these cases, the webspam team has indicated that they like to see evidence that you've taken steps to remove the links, not just submit a disavow file. I'd do a complete audit of your links in GWT, determine which links are no longer there and which you can successfully have removed, and work on those. Document all your work, then update your disavow file with only the links that you can't successfully remove manually, and submit all this documentation to Google.
2. The letter from Google seems to hint that the problem may not be your backlinks but some other quality guideline violation, like selling links, hidden text, or doorway pages. Looking at your site, it seems pretty clean, but sometimes problems stay hidden. Refer to Google's quality guidelines for a list of things to look for: https://support.google.com/webmasters/answer/35769?hl=en#3
As a last resort, you can ask Google, via reconsideration request, for a list of example pages. They may or may not honor this request, but it's worth trying. Unfortunately, the webspam team isn't known for giving out tons of information, but if you're nice enough and persistent enough, they may give you a clue. Also, I highly recommend you post this question on the Google Webmaster Forums and explain what you did here. Often a Googler will jump in on those forums and offer to help: http://productforums.google.com/forum/#!forum/webmasters
Hope you can get to the bottom of this. Best of luck!
| Cyrus-Shepard