Category: Intermediate & Advanced SEO
Looking to level up your SEO techniques? Chat through more advanced approaches.
-
Does Googlebot Read Session IDs?
Safest bet: set up canonicals that point to the page minus the parameter, so even if Google does read the session IDs it will understand that they relate to the canonical URL. Honestly, I'm not 100% sure whether Google reads those session IDs either, and I've seen conflicting information. I know it reads other parameters as separate URLs... I had a few issues with the way one of our sites handled products (sometimes it was ?model=, sometimes ?prod_id=, and some old products also had ?sku=). But adding the canonicals will solve this problem if it exists, and if the problem doesn't exist, it won't hurt to have a self-referential canonical sitting in the code in case someone scrapes your site.
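As a sketch of the idea (the parameter names here are examples, not a definitive list; adjust them for your platform), the canonical target is just the URL with the session parameters stripped:

```python
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

# Example session/tracking parameter names; substitute your site's own.
SESSION_PARAMS = {"sessionid", "sid", "phpsessid"}

def canonical_url(url):
    """Return the URL minus session parameters; this is the target a
    rel=canonical tag on every session variant should point to."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k.lower() not in SESSION_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))
```

So `/product?prod_id=42&sid=abc123` canonicalizes to `/product?prod_id=42`, and every session variant carries the same `<link rel="canonical">` tag.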
| MikeRoberts0 -
Product Colours change on ecommerce store... similar descriptions.
I agree with Chris. In the specific example of coffeemakers, I would run a simple usability test (e.g. via http://www.usertesting.com). Have your visitors shop both ways and see which they prefer. Personally, for things like coffeemakers or even clothing, I expect that if other colors are available, I'll be able to select the color from a drop-down menu on a single core product page. Chris's point about potential cannibalization issues is a very good one, too. Hope that helps!
| danatanseo0 -
URL Parameters Duplicate Page Title
Hi Paul! Is there any way to know whether the GWT parameter directive is working? Since only Google has that information, do they have a way to cross-check that the directive was correctly given and executed? As an aside, I will be really pushing for canonical tags when I meet with our MIS department, but I have been warned that I may not be successful. Why would that be? I must add that we have an old, rickety site authored in SharePoint 2010. My job is to do the best I can and cross my fingers we upgrade to SP 2013. Thanks, Sarah
| SSFCU0 -
Penguin 2.0 update, ranking dropped. Advice needed!
Yeah, that's certainly painting the picture that this probably was Penguin and not just a coincidence. Best we know so far (and that's very speculative), Penguin 2.0 didn't introduce much in the way of new ranking factors. That is to say, it's not looking at different things; it's just looking deeper into sites and probably being a bit less forgiving. So the same rules for recovery probably apply to Penguin 2.0 that did to the original Penguin. Unfortunately, that often means deep cuts to any questionable links and really focusing on quality. If you can't remove the links, that's going to mean disavowing, and even once you've disavowed, you may need to wait for a Penguin data update. Recovery stories have been hard to interpret, even for some very high-end SEOs I know who have been through the process. The one consistent story we've heard is that, if it is Penguin, you can't just remove a few links, disavow, and keep hoping for the best. It may take pretty drastic measures, and you have to weigh those measures against the loss you took. In other words, you have to figure out whether the cure is worse than the disease. Sorry to paint a bleak picture, but Penguin is pretty harsh.
| Dr-Pete0 -
Will aggressive use of branded keywords in anchor text attract Penguin’s wrath?
Honestly, given that you currently have relatively few cities, consolidating might not have a big impact. I'd be cautious going forward, if you were going to expand into hundreds of markets or started having a ton of results under every market, but for now it probably makes more sense to improve your link profile and try to get some content on the site that isn't entirely search results.
| Dr-Pete0 -
Meta No INDEX and Robots - Optimizing Crawl Budget
Are you still linking to those pages? If so, I would just keep the noindex,follow meta tag in the header. That way you still benefit from the link juice flowing through these pages, link juice that would be lost if you just blocked them from being crawled via robots.txt.
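As a minimal illustration of the tag in question (a sketch; the helper function and its name are mine), noindex,follow keeps a page out of the index while a robots.txt Disallow would stop crawling, and therefore link-juice flow, entirely:

```python
def robots_meta(index=False, follow=True):
    """Build a robots meta tag for the page <head>. The default,
    noindex,follow, keeps the page out of the index while still
    letting crawlers follow (and pass equity through) its links."""
    content = ("index" if index else "noindex") + "," + \
              ("follow" if follow else "nofollow")
    return '<meta name="robots" content="%s">' % content
```

`robots_meta()` yields `<meta name="robots" content="noindex,follow">`, the tag to keep in the header of those pages.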
| TakeshiYoung0 -
Rel=canonical an iframed version of the same website?
This is weird. Why would you want an iframed version of a website? Also, what difference does it make where the employees' email addresses reside? They could be at dogsled.com for all anybody cares! I agree with Chris yet again: 301 that bad boy and be done with it. Rel=canonical does NOT pass link juice, so definitely don't use that as a fix. It would help with the duplicate content issue and prevent any Panda penalties, but a 301 would do that while passing link juice. Maybe you could elaborate on why a 301 is out of the question?
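A minimal sketch of that 301, using Python's standard library (the hostname is a placeholder; in practice you'd do this in your web server config rather than application code):

```python
from http.server import BaseHTTPRequestHandler

MAIN_SITE = "https://example.com"  # placeholder: the non-iframed original

class RedirectHandler(BaseHTTPRequestHandler):
    """Answer every request on the duplicate copy with a permanent
    redirect to the same path on the main site."""

    def do_GET(self):
        self.send_response(301)
        self.send_header("Location", MAIN_SITE + self.path)
        self.end_headers()

    def log_message(self, fmt, *args):
        pass  # keep the example quiet
```

Because the redirect is permanent (301, not 302), search engines consolidate signals onto the target URL.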
| jesse-landry0 -
Trouble ranking
We have not received any warnings in Webmaster Tools. All of these links were on the site before we started, and the number of poor links like that is fairly minimal. They do not have a ton of links to begin with, and I can't imagine directory links, which were valid in the past, would harm them like this. The links are not in the form of blogrolls, paid links, irrelevant sites, link lists, or anything like that. We have been building links for them through guest blogging and local university discount programs.
| Atomicx0 -
Penguin Update, what I've noticed
I think a lot of people are not taking into account that, with the Penguin 2.0 update, the sites backlinking to you might be devalued. So you look at your own backlink profile and think it's OK, but the sites that link to you might have had their authority reduced (maybe even been penalised). And then what about the backlinks of the backlinks of those backlinks (backlinkception)? My understanding is that Penguin 2.0 is only about backlink profiles (like 1.0), and everything else is speculation at this stage without a lot of research. I'm not saying what you've noted is incorrect, just that there are far too many variables at this stage to come to any conclusions, and some bigger research needs to be done on it. "So we are focusing now on links to home page" - I think that might be a good idea; more links to the homepage would look more natural, IMO.
| PaddyDisplays0 -
Someone copied my content and it's ranking higher than mine!
@Tim I think what you stated could be one of the reasons. We have a good following on our company page but not on the personal profile, so that may well be a factor. We will work on implementing this ASAP. @Paddy That's a great idea. We found out that the other person is hosted with GoDaddy; I hope they will help us get this sorted out. @Mossa Thanks a lot for those great tips. Can you please let us know what exactly you mean by "earn some quick links by Outreaching strategy"? Thanks a lot, all, for the great response. Regards, Raunek
| marineinsight0 -
Large number of pages crawled.
As Takeshi noted, you've got a lot of pages like this: http://printlabelandmail.com/home/answer-4-questions_06-2/ (look, you just earned a link; I don't do that for everyone). I'm not certain whether that is actual content, but it's likely not what you want in terms of user experience. I'd search for where those pages are being generated and try to root out the cause. Also, because you are using WordPress, I highly recommend Yoast's WordPress SEO plugin: http://yoast.com/wordpress/seo/
| Cyrus-Shepard0 -
Is it better to not allow Google to index my Tumblr Blog?
If it's on a sub-domain, it's not affecting your main domain. If it's part of your marketing mix, I'd continue to use it and see whether it makes sense. If not, at some point you could drop the custom domain setting and still let it sit there. Is the content you have there unique? If there isn't enough engagement, is it the kind of content you are publishing?
| NakulGoyal0 -
How do I persuade Google to reconsider my site?
You got a manual link penalty, I think. I have dealt with such penalties three times now (two turned out successful and one is ongoing). Here is what you should do: 1. Get rid of spammy links (low quality and medium quality too). When you're manually reviewed, even good-looking links can count against you (the algorithm can't detect them, but humans can). 2. Use the disavow tool on the remaining medium/low-quality links you couldn't delete (and the ones you deleted too). Try to disavow domains rather than individual URLs; lingering in that penalty is worse than over-disavowing links, as it's easier to build new links again. 3. Build a link asset on-site and try to earn some high-quality, natural links (this should be done before the reconsideration request). If your site deserves to rank high, it should still receive high-quality links even when it's not showing in Google's results. That sounds logical, right? 4. Wait 4-6 weeks after submitting the disavow file to Google, and then send a reconsideration request. Bear in mind, you will most likely be denied on the first and second requests, maybe 80% of the time, but the third one turns out successful more often, so you have to be patient. One last piece of advice: if recovering the website doesn't pay off the amount of work, time and money you will invest, just throw it away and start over with a new domain. And if you need someone with experience to handle the problem and take a closer look (audit) at your link profile, don't hesitate to contact me. Wish you good continuation.
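A sketch of step 2. The file format (`domain:` lines plus `#` comments) is the disavow tool's documented format; the helper function and names are mine, purely for illustration:

```python
from urllib.parse import urlparse

def build_disavow(bad_urls, comment="Links we could not remove"):
    """Turn a list of offending URLs into disavow-file text with one
    domain: line per host, disavowing whole domains rather than
    individual URLs to cast a wider net."""
    domains = sorted({urlparse(u).netloc for u in bad_urls})
    lines = ["# " + comment] + ["domain:" + d for d in domains]
    return "\n".join(lines) + "\n"
```

Upload the resulting text file in Webmaster Tools, then start the 4-6 week clock before the reconsideration request.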
| rikano0 -
GWT Crawl Error Report Not Updating?
Thanks for the tip, Jacob! I was indeed back in business about a week later. I will keep seroundtable.com on my quick list for any such situations in the future.
| tonyperez0 -
Novice Question - Can browsers realistically distinguish words within concatenated strings, e.g. text55fun, or should one use text-55-fun? What about foreign languages, especially more obscure ones like Finnish, which Google Translate often mistranslates?
Omid, Thanks very much for the fast response. Totally agree about Google's AI; my objective is a completely white-hat, original, content-rich website of the kind their AI favors. Google is attempting to weed out folks who avoid the hard work associated with building these types of websites. Not everything in business has a brand-building goal; only so many brands can rise to the top. I'm looking at this from a very different perspective and would be very happy to find my 1,000 to 5,000 specific people or so. From that point, I would have my work cut out for me from a business standpoint, nothing to do with SEO or anything related to this field. Just simple blocking and tackling: sales, customer service, marketing, delivery, etc. I'm an old-school guy (sell high-value, high-margin products and services to customers who value you and want to stay with you over long periods because of great customer service), and this is an experiment for me using a new-school sales tactic. In my experience it does not take a lot of those types of customers to build a very nice small business. The really hard part is actually creating an organization which genuinely delivers on those commitments consistently over long periods of time and retains those customers. All the best, Newell
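For what it's worth, on the word-separation question in the title, a minimal sketch (the function name and regexes are mine, not any standard) of rewriting concatenated strings into hyphen-separated tokens, which both crawlers and human readers parse more reliably:

```python
import re

def hyphenate(slug):
    """Insert hyphens at letter/digit boundaries, then collapse any
    remaining runs of non-alphanumerics into single hyphens."""
    slug = re.sub(r"(?<=[A-Za-z])(?=[0-9])|(?<=[0-9])(?=[A-Za-z])", "-", slug)
    return re.sub(r"[^A-Za-z0-9]+", "-", slug).strip("-").lower()
```

So `hyphenate("text55fun")` gives `text-55-fun`. For Finnish or other languages, hyphenation won't fix translation quality; keeping the words whole and correctly spelled in the original language matters more.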
| ny600 -
Is it worth buying a GREAT domain with rank?
OK... you say that you have.... -- GREAT domain -- Great rankings -- Valuable keywords -- Tidy link profile -- horrible website That gets my blood pumping... I would be sniffing this site to see if I can do these... -- improve the SEO and triple the traffic -- redesign the site, make it twice as easy to use and double the conversion rate -- create new content to make it more credible and go after new keywords that will double the traffic -- reformat the product pages and double the income -- so, after you triple the traffic from SEO, double the conversion rate, double the traffic with new keywords and double again with better product pages.... this site will be making a fantastic amount of money. This, to me, sounds like a candidate for a flip. A 301 redirect might be selling a champion racehorse for horse meat. If this site was under my nose I might be grabbing it and taking away the sales from everyone else in that niche. (No guarantees on the above... it depends upon the attributes of the site you are starting with, your skill set and how hard you are willing to work.)
| EGOL1 -
Does Yahoo Directory Listing Pass Authority with PA:0 and 0 links from 0 Root Domains?
The most likely explanation is that Moz's bot has not crawled that page (it is too deep) to give it a value. Note: Google's bots crawl a lot more than Moz's bot, so Google will give that page a value. It's easy to test: google the URL and see if it's indexed. Whether it passes any authority is a different question. I have heard mixed opinions about the Yahoo Directory, some saying it's worth the money, others saying it's not. Personally, right now, I would not bother paying for it, but I would not tell someone they shouldn't get it. At my last job I did not renew the link one year and it did not seem to have any effect on the site's SERPs (but I cannot prove it did not). If you search the forums you will find a lot of topics about it.
| PaddyDisplays0 -
Redirect Chains - Accept the 301 chain or link from the original page??
I thought that, in general, you could use the canonical tag cross-domain too: http://moz.com/blog/cross-domain-canonical-the-new-301-whiteboard-friday
| Houses0 -
REPOST: How much does "overall site semantic theme" influence rankings?
Hi! I'm not entirely sure, but I think I can see what you're getting at. You're used to working with sites focussed on a particular niche, but now you have a site to work on that encompasses a variety of categories. "Does this influence the ranking possibilities?" It's the same: you can rank each page of a site for its target keywords with relevant content, good site architecture and link authority. Think about large sites that rank for a huge variety of terms, e.g. Amazon. There's a good post on site architecture here, and another one more specifically about "content hubs" here. I hope that's helpful and that I've understood your meaning correctly.
| magicrob1 -
Does it make sense to buy a domain to be used in a year or so?
Thanks, Kane, will do that!
| wspwsp0