Questions
Can I use a 301 redirect to pass 'back link' juice to a different domain?
It definitely passes link juice. An anti-extortion activism group I belong to recently tried to stop an individual from monetizing a domain that publicly shames people. The black-hat operator had identified dead links in a NY Times article and wanted the DoFollow links from such an authoritative site, so he purchased the expired domains and redirected them to his own site. Luckily we caught it on a backlink check, and one of the group members was a journalist who was able to get the links removed.
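For reference, a domain-wide 301 of the kind described above is usually set up at the server level. A minimal sketch for Apache with mod_rewrite (the domain names are placeholders, not the domains from the story):

```apache
# .htaccess on the old domain: permanently redirect every URL
# to the same path on the new domain with a 301 status,
# which is what passes the link equity along.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?olddomain\.com$ [NC]
RewriteRule ^(.*)$ https://newdomain.com/$1 [R=301,L]
```

A backlink check, as mentioned above, is how you spot this kind of redirect from the receiving end.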
Technical SEO Issues | TucsonAZWebDesign
In this situation, should I consolidate two pages into 1 for stronger SEO?
Hi DAGU,

This is an interesting question and one that requires research. Google's RankBrain will identify query intents that are essentially the same. You can see this by typing keywords into the search engine: search for 'how much is a haircut' and you'll see bolded in the SERP entries 'haircut prices', 'average cost', 'haircut cost', 'hairdresser cost' and so on. So you know Google considers these to be the same thing, and if you mention 'price', 'cost' and 'average price' loads of times, you risk getting slammed for keyword stuffing, because these are variants of the same words.

Now, with products it's a little more difficult, but the same principle applies. You need to find out what people (and Google) consider to be within the same topic, and where the edges of that topic are. That takes testing, time and research, but the best place to start is with Googling. On the highest-ranking pages, do they group the products together or put them on separate pages?

Also use your brain (I don't mean that in a derogatory way): shut your laptop and think critically about what your customers want to see on each page, which type of customer might be looking for which product, and why. Even better, go and ask them! Looking at successful pages will give you a good start. Make sure you have your MozBar on, and discount the pages with very high DA or PA, because they may be ranking not because of on-page factors or smart information architecture but simply because they are established players. You want to model the smaller, newer players who are doing it right.

There is basically a balance to be struck between the SERP entry (very important for people looking for something specific) and the page content, which needs the depth and comprehensiveness to rank. And there's no tool out there, except your own brain and your customers and staff, that's going to give you those answers. Also, don't be afraid to test.
I treat 'gummy smiles' with Botox, so I had that as part of my Botox page. This morning I took it out and made a page dedicated to gummy smiles that includes laser gum contouring, crown lengthening and all sorts of other treatments. Let's just see what happens; you can always change it back if it doesn't work.

But be aware of volume. If you've got one page for the "Nobel Biocare Straumann Titanium Implant", it's just not going to pick up enough search volume, and Google won't get enough data from Chrome to rank it. So I have it as a tab on my Dental Implants page, and if someone does search for it, it's marked up as an H3, so it comes up as a blue hyperlink.

Good luck. This is a critical-thinking job and a research job, so get Googling and see where Google draws the boundaries of your topics. Spending two days just Googling products and making notes will not be time wasted, because you'll get a feel for it. And once you start to get a feel for these things, you can start using your intuition and taking shortcuts. Most of the stuff I do now, I just wing it, run a few tests and pick a winner, because I've done so much painstaking research that I have a pretty good idea of what Google has in mind for my categories and topics.
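A rough sketch of the tab markup described above (the class name and fragment id are made up for illustration):

```html
<!-- A product tab on the Dental Implants page. The H3 gives the
     product its own heading that Google can surface as a link. -->
<section id="nobel-biocare-straumann" class="implant-tab">
  <h3>Nobel Biocare Straumann Titanium Implant</h3>
  <p>Details of the implant, pricing and suitability go here.</p>
</section>
```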
Technical SEO Issues | Smileworks_Liverpool
Site with 2 domains - 1 domain SEO optimised & 1 is not. How best to handle crawlers?
Hello! I can answer this from a Google / SEO perspective (rather than a Moz-tool perspective).

First, check whether the secure subdomain's content is indexed.

If the secure subdomain is NOT indexed, leave the robots.txt crawl block in place. You don't want or need Google crawling secure pages and payment pages; just be sure they truly are all private pages. The crawl block is best here: it prevents Google from crawling, and if Google can't crawl, it can't index.

If the secure pages ARE indexed:
1. Remove the robots.txt crawl block.
2. Add a meta noindex tag on all the pages.
3. Wait for them to be deindexed (removed from Google).
4. Then block them from being crawled with robots.txt again, which will prevent them from being crawled, and thus prevent them from being re-indexed.
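The two mechanisms referred to above look like this (the subdomain is a placeholder):

```
# robots.txt crawl block, served at https://secure.example.com/robots.txt
# Blocks all compliant crawlers from every path on the subdomain.
User-agent: *
Disallow: /
```

```html
<!-- Meta noindex, placed in the <head> of each secure page
     while Google is still allowed to crawl and see it. -->
<meta name="robots" content="noindex">
```

The ordering matters: the meta tag only takes effect while the pages are crawlable, which is why the robots.txt block is removed first and only re-added after the pages have dropped out of the index.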
Getting Started | evolvingSEO
Noindex follow on checkout pages in 2017
Hi

Ironically, Moz will pick this up as a problem, as it reports anything that is noindexed! I just ignore noindex as a "problem" in certain cases, as it clearly makes perfect sense to noindex certain pages, and indeed sometimes whole directories. I sometimes find that developers have noindexed directories like /new-products or /sale, but there are better ways of handling the potential duplicate-content problem there, such as adding a canonical. In your case, it makes no sense to have Google index the checkout pages.

Regards
Nigel
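For a checkout page, the noindex-follow directive the question title refers to is a one-line meta tag in the page's `<head>`:

```html
<!-- Keep checkout pages out of the index, but let crawlers
     follow any links on them so link equity still flows. -->
<meta name="robots" content="noindex, follow">
```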
Search Engine Trends | Nigel_Carr