Category: Intermediate & Advanced SEO
Looking to level up your SEO techniques? Chat through more advanced approaches.
-
How to deal with number swaps for organic results?
Nope, you're fine: Ifbyphone is a good service. I agree with Spencer that you should be good to go as long as the JavaScript is working correctly.
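For anyone wondering why number swaps are SEO-safe, here is a minimal sketch of how dynamic number insertion (DNI) services generally work: the crawlable HTML keeps the canonical business number, and client-side JavaScript swaps in a tracking number after the page loads. `swapNumber` and the phone numbers are made up for illustration; this is not Ifbyphone's actual code.

```javascript
// Hypothetical sketch of dynamic number insertion (DNI): the HTML that
// crawlers see keeps the canonical business number, and JS swaps in a
// per-campaign tracking number after load. Names/numbers are placeholders.
function swapNumber(html, canonicalNumber, trackingNumber) {
  // Replace every occurrence of the canonical number with the tracking one.
  return html.split(canonicalNumber).join(trackingNumber);
}

// In a browser you would apply it to the DOM after load, e.g.:
// document.body.innerHTML =
//   swapNumber(document.body.innerHTML, "555-0100", "555-0199");
```

Because the swap happens client-side, search engines that index the raw HTML still associate the canonical number with the business.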
| JasmineA0 -
URLs are not indexed
Hi Prashant,

If these URLs are only created when a user searches for a term, and are not linked to anywhere in HTML either on your site or elsewhere on the Internet, that is a very good reason why they are not indexed. Google does not usually perform queries on a site (e.g. fill in forms) to "discover" what content might be displayed when those forms are filled in. Its tried-and-tested method of crawling is just that: crawling links and text in HTML. It has become more adventurous with different technology and sometimes finds things that it wouldn't have previously, but linking is still the primary way to ensure something gets crawled. In many cases, you wouldn't want Google finding content that it has to perform queries, fill in forms, or run searches to get to: this is how some sites create massive amounts of duplicate content by accident. So in a way, Google is doing everyone a favour by not indexing URLs like this.

"We have submitted these urls on google webmaster using a sitemap for indexing, still none of them are indexed."

I tend to think of sitemaps like road maps: they're a guide. The site itself is the road. If a map tells me that I can drive across a river but when I get there, there is no bridge, I'm not going to drive across the river. Maybe I will if I have a huge four-wheel-drive car with a snorkel, but possibly not. Maybe Google will index URLs it can't find on the web itself, but possibly not. If you want the URLs to be found, link to them.

Cheers,
Jane
| JaneCopland0 -
Does it matter if the meta description and meta keywords come before the title tag in the head?
Actually, Bing has said that a bad (spammy) meta keywords tag can be a negative signal to them: http://searchengineland.com/the-meta-keywords-tag-lives-at-bing-why-only-spammers-should-use-it-96874
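On the original ordering question: a hedged illustration of a conventional head layout. Tag order within the head is not known to affect rankings; the spam signal Bing describes is about the keywords tag's content, not its position. The title and description below are placeholders.

```html
<head>
  <!-- Order within <head> is not known to matter for rankings;
       this is just the conventional arrangement. -->
  <title>Example Page Title</title>
  <meta name="description" content="A concise, human-readable summary of the page.">
  <!-- Most sites now omit meta keywords entirely, given the spam signal above. -->
</head>
```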
| KeriMorgret0 -
Does Link Detox Boost Work?
The question was about whether Link Detox Boost works, not about using Link Detox. I do think Link Detox is a great program; Boost just isn't.
| netviper2 -
Google and JavaScript
When Google puts out recommendations like this, they rarely lead people down a self-destructive path. If JS and CSS files could contain relevant information that helps Google crawl or index your site more appropriately, then I say let them see those files. Sorry I have no data to back up my position, but the articles you listed make a good case. I read a similar article a few weeks ago and unblocked JS and CSS in robots.txt, but I haven't really thought about it since.
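For reference, unblocking script and style assets usually comes down to a couple of robots.txt lines like the following. This is a sketch with placeholder paths; the wildcard `*` and end-anchor `$` are supported by Googlebot but not by every crawler.

```text
# Hypothetical robots.txt fragment: let Googlebot fetch script and style
# assets so it can render pages. The /private/ path is a placeholder.
User-agent: Googlebot
Allow: /*.js$
Allow: /*.css$
Disallow: /private/
```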
| kwoolf0 -
Pros and cons of video onsite or youtube
CON: "if your video ranks, it will direct users to YouTube, not your website - so there is limited SEO value from that perspective." I agree with this. Having your traffic directed to YouTube doesn't really do your site or traffic any favors, BUT you can include links in the description and on the video overlays to combat this. Vimeo Pro allows you to embed videos directly on your site without the branded video player.

PRO:
- It's free, and really easy to use/upload/delete/share/modify.
- A lot of people use YouTube. Seriously.
- Google indexes YouTube videos in search result pages.
- You can overlay ads on your videos for extra revenue.

Vimeo is $100 a year, if I recall correctly, keeps all your videos under one account, and uses a Gmail account to manage/log in.

To be honest, I don't think it makes a difference. I would focus on spending extra time on your video to make it more "conversion-friendly". Spend more time on your message and what you want to communicate to your viewer. That way, it won't matter where you host it: people will want to come see you.
| David-Kley0 -
Canonical tag + HREFLANG vs NOINDEX: Redundant?
It depends a bit on your setup and how easy or difficult it is to implement the tags, but a couple of things to bear in mind:

NOINDEX, FOLLOW should still mean the hreflang tags on the page are seen and followed even though the page in question is not indexed: the page has to be parsed for the crawler to read the meta tag at all. If you are not facing a serious crawling issue and your system automatically adds the canonical and hreflang tags, I would leave them as-is even on the noindexed pages (you might, for example, want to start indexing the French pages but not the English pages of certain category/brand combinations, and the hreflang tags might help the crawlers recognise that kind of change faster).

As for x-default, my understanding is that it is mainly for multinational setups with default landing pages and/or automatic language/region redirections. Your URL suggests you are aiming at Canada (and maybe the USA) and only at en/fr, so it might not be crucial in this case. Check out this page for some more details.
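To make the setup above concrete, here is a sketch of a head section for a French-Canadian page that is noindexed but still carries its canonical and hreflang annotations, so crawlers can read the alternates while the page stays out of the index. All URLs are placeholders.

```html
<head>
  <!-- Illustrative only: noindexed page that still exposes its
       canonical and hreflang relationships. URLs are placeholders. -->
  <meta name="robots" content="noindex, follow">
  <link rel="canonical" href="https://example.ca/fr/chaussures/">
  <link rel="alternate" hreflang="fr-ca" href="https://example.ca/fr/chaussures/">
  <link rel="alternate" hreflang="en-ca" href="https://example.ca/en/shoes/">
  <link rel="alternate" hreflang="x-default" href="https://example.ca/en/shoes/">
</head>
```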
| LynnPatchett0 -
Microsites: Subdomain vs own domains
Unless there is a specific reason to keep the areas separate (e.g. you don't want people to be confused about the brand, or one activity is very inappropriate when paired with the other), it's usually best to keep the content on the same domain. brand.com/activity also usually looks more professional than exact-match-keyword.com. Furthermore, you often gain more trust when people see that the site's offering is well-rounded and includes a variety of activities. You can also benefit from accumulating good reviews from happy customers on just one website, rather than accumulating surfing holiday reviews for one of your travel sites, wedding holiday reviews for another, and so on. Again, if you feel that the subjects clash badly and will create a weird user experience, you might look at developing different websites, but if you have a range of different activities / holidays that involve one location, I'd be 99% confident that these should all sit on the same domain.
| JaneCopland0 -
Can cookies harm your website?
The instruction in the audit template is probably there to give you a 100% true view of what Google sees when it comes to the website, unobstructed by things meant for humans like cookies, JavaScript, etc. As Stephen says, Google traditionally does not accept cookies or execute JavaScript, or do a lot of things that are meant for usability (though it may take some of this into account if it's being used for malicious purposes). This is not about whether cookies are "harmful", and you are not being instructed to turn off cookies on the website itself. You are being instructed to stop your browser from accepting cookies so you get an idea of what Google's experience of the website is like. Edited to add: posts like this are being shared literally today, so keep an eye on the Google-accepting-cookies issue; but for the purpose of the site audit, the instruction just wants to let you see what Google traditionally sees.
| JaneCopland0 -
Schema markup for video playlists?
If you don't want to adjust the technical implementation of the videos, I'd recommend just using an XML video sitemap rather than Schema.org markup to provide metadata about your videos. You can use the gallery_loc tag to account for playlists.
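A minimal sketch of what such a video sitemap entry might look like, assuming Google's video sitemap namespace; all URLs and titles are placeholders, and the gallery_loc element points at the playlist page.

```xml
<!-- Illustrative video sitemap entry; URLs/titles are placeholders. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>https://www.example.com/videos/widget-demo</loc>
    <video:video>
      <video:thumbnail_loc>https://www.example.com/thumbs/widget-demo.jpg</video:thumbnail_loc>
      <video:title>Widget Demo</video:title>
      <video:description>A short demo of the widget in action.</video:description>
      <video:content_loc>https://www.example.com/media/widget-demo.mp4</video:content_loc>
      <video:gallery_loc title="Widget Playlist">https://www.example.com/playlists/widgets</video:gallery_loc>
    </video:video>
  </url>
</urlset>
```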
| PhilNottingham0 -
[Need advice!] A particular question about a subdomain to subfolder switch
Thanks a lot for the answer, SMG. I really appreciate it. What is special about our case is that we actually only need organic traffic on the individual apps, not on the root domain, so we mostly care about the strength of the individual apps. I simply assume that a lot of people will link to the landing page (rootdomain.com). So with a subfolder system I hope to pass more juice from the root domain to the apps, and also that app1, for example, can better support app2 through a better distribution of the root domain's strength. Well, I guess there might be no other way than to try it. If anyone else has another opinion, I would love to hear it before I make the changes tomorrow.
| ummaterial0 -
I want to block search bots from crawling all my website's pages except for the homepage. Is this rule correct?
Some great answers already. You can also find a list of all the robots here and here. Depending on your site, you could also hide the rest of it behind a login screen or a form, which bots won't fill in.
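For the record, one common way to express "homepage only" in robots.txt looks like this. It is a sketch, and it relies on the `$` end-anchor and the longest-match-wins precedence rule, which Googlebot honours but some other crawlers do not, so test it in Search Console's robots.txt tester before relying on it.

```text
# Allow only the root URL; block everything else.
# The "Allow: /$" rule is longer (more specific) than "Disallow: /",
# so for Googlebot it wins for the homepage URL only.
User-agent: *
Allow: /$
Disallow: /
```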
| GPainter0 -
Any recommended affordable link removal services?
A DIY approach that uses a custom tool is http://www.rmoov.com/index.php, which many people have used with success. Regular Moz contributor Sha Menz writes about this tool (she works for the company that makes Rmoov) here, and also shows what Remove'em (another tool) does. I agree with Oleg in that thread that there is often little difference in the way different services handle link removal, besides price and initial approach, but I'd go further and say that a standard, tested follow-up protocol (how long until you send follow-ups? What language is used?) is also important if you are going with an agency. Many if not most SEO agencies will provide this service, but it may not be particularly economical depending on the agency's business model. Many hope to provide ongoing services to clients, or at least services that extend beyond one link-removal campaign.
| JaneCopland0 -
What Happens If a Hreflang Sitemap Doesn't Include Every Language for Missing Translated Pages?
Hi Kyle, I would probably only include the URLs that have been translated in the hreflang sitemap, unless the English content that hasn't been translated can be assigned to one territory / language, e.g. en-us or en-gb. If the pages that haven't been translated are otherwise linked to throughout the site and are present in a regular sitemap, they will be found. I am not 100% sure that leaving them out of the hreflang sitemap is the way to go, so I will leave this question open for other people to reply as well. However, I'd say that if you want the English-language content to serve internationally because it has not been translated, it would be a mistake to target it at one English-speaking area via hreflang.
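A sketch of an hreflang sitemap entry that lists only the languages a page actually exists in; the untranslated English-only pages would simply appear in the regular sitemap instead. URLs are placeholders, and each URL in a pair lists the full set of alternates, including itself.

```xml
<!-- Illustrative hreflang sitemap fragment; URLs are placeholders. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://www.example.com/en/pricing/</loc>
    <xhtml:link rel="alternate" hreflang="en" href="https://www.example.com/en/pricing/"/>
    <xhtml:link rel="alternate" hreflang="fr" href="https://www.example.com/fr/tarifs/"/>
  </url>
  <url>
    <loc>https://www.example.com/fr/tarifs/</loc>
    <xhtml:link rel="alternate" hreflang="en" href="https://www.example.com/en/pricing/"/>
    <xhtml:link rel="alternate" hreflang="fr" href="https://www.example.com/fr/tarifs/"/>
  </url>
</urlset>
```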
| JaneCopland0 -
What to do about similar product pages on major retail site
Hi Ryan,

I guess the first point here is that Google doesn't treat this sort of filtering as "penalisation"; it's just filtering two or more versions of the same content because it believes (sometimes mistakenly) that users don't need to see two versions of the same thing. This gets REALLY tricky in fields like real estate, where all the aggregators in the same town have access to pretty much the same feeds of properties.

If Google were perfect, you'd put up the two pieces of identical content for all 55 million products, and Google would serve the right one given the appropriate query, like the example above ("fridge sale san antonio" brings up the local page; "refrigerator" has your main site rank). And this might happen, because Google is getting better at this sort of query-appropriate result. We still recommend not providing duplicate content, solely because we can't be sure that Google will get it right. As an aside, it would be great if they worked on a tool for localisation in the same way that they have given us the hreflang tag for internationalisation. rel="city" or similar would be awesome, especially for big countries.

Your idea about serving the content from a shared source will certainly work (iframe, text hosted on a separate URL, JS, etc.). The pages serving this text clearly won't be credited with that text's content, which of course removes its SEO value.
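To sketch the shared-source approach mentioned above, either of the following would pull a common product description from one URL so it isn't duplicated in each local page's indexable HTML. Paths and IDs are placeholders; note that Google can increasingly render JavaScript, so neither technique guarantees the text stays out of the index.

```html
<!-- Option 1: iframe the shared description (placeholder URL). -->
<iframe src="https://www.example.com/shared/fridge-description.html"
        title="Product description" width="600" height="300"></iframe>

<!-- Option 2: inject it client-side after load (placeholder path/ID). -->
<div id="shared-description"></div>
<script>
  fetch('/shared/fridge-description.html')
    .then(function (res) { return res.text(); })
    .then(function (html) {
      document.getElementById('shared-description').innerHTML = html;
    });
</script>
```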
| JaneCopland0 -
Uncontrollable Spammy Backlinks - Disavow or Not?
Thanks for your response Marie, That is very helpful!
| evan890