Category: Intermediate & Advanced SEO
Looking to level up your SEO techniques? Chat through more advanced approaches.
-
Ecommerce Site - Duplicate product descriptions & SKU pages
Hi, yes, we use the SKUs on the other pages too. Thanks for everyone's feedback.
| BeckyKey0 -
Location in URLs question
Hi, I do not think it will hurt your SEO, as the primary city of the story is "Denver" and it's ranking well there. You can put up a small editorial note at the top, like Moz does, to show people it is primarily related to Denver and how it applies to other cities, or you can use city breadcrumbs in URLs - a quick example is below. Hope this helps! Umar
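To illustrate the breadcrumb idea, a city-first URL structure might look something like this (hypothetical paths, since the actual site structure wasn't shared):

    example.com/denver/original-story/
    example.com/boulder/related-version/

That keeps the primary city visible in the path while giving other cities their own clearly labeled versions of the piece.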
| UmarKhan0 -
Should I delete 100s of weak posts from my website?
This is a very valid question, in my opinion, and one that I have thought about a lot. I even did it before on a UGC section of a site where there were about 30k empty questions, many of which were a reputation nightmare for the site. We used two criteria:

- Over a year old
- Had not received an organic visit in the past year

We 410d all of them, as they did not have any inbound links and we just wanted them out of the index. I believe they were later 301d, and that section of the site has now been killed off. Directly after the pages were removed, we saw a lift of ~20% in organic traffic to that section of the site. That lift held, and over time the section started getting even more organic visits. I saw it as a win and went through with it because:

- They were low quality.
- They already didn't receive traffic.
- By removing them, we'd get more of the pages we actually wanted crawled, crawled.

I think Gary's answer of "create more high quality content" is too simplistic. Yes, keep moving in the direction you are, but if you have the time, or can hire someone else to do it, and those pages are not getting traffic, then I'd say remove them (a minimal 410 sketch is below). If they are getting traffic, maybe test going back and making them high quality to see if they drive more. Good luck!
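For anyone wanting to do something similar, here is a minimal sketch of returning 410s with Apache - the URL pattern is hypothetical, not from the original post:

    # .htaccess: return 410 Gone for retired UGC questions
    RewriteEngine On
    # Any URL under /questions/ that we have decided to retire
    RewriteRule ^questions/old-thread-.* - [G,L]

The [G] flag makes Apache answer 410 Gone, which tells crawlers the page was removed deliberately rather than being temporarily missing.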
| dohertyjf1 -
Pagination parameters and canonical
Hi Teconsite, this is a great question. I would not recommend marking the "p" parameter in Search Console. Instead, I'd leave it as "Let Google Decide" and use your pagination SEO implementation to guide the search engines. There is still a lot of debate around pagination as it relates to SEO. The way I have always implemented it is:

- Every paginated page canonicals to itself, because you do not want the search engines to start ignoring your paginated pages, which are there somewhat for users but also for SEO.
- Use rel next/prev to help Google understand that they are in pagination, which will also help them rank the beginning of pagination for the terms you are trying to rank for.
- Use noindex,follow on pages 2-N to be sure they stay out of Google's index.
- Use the numbers showing how long pagination is to drive the search engines deep into your pagination to get all of your products/whatever indexed. This is often done by linking to page 1, the last page, and the 3-5 pages on either side of the page you are currently on. So page 7 of 20 would link to page 1, pages 5-9, and page 20.

The reason most people say to canonical pages 2-N to the base page is to preserve any link equity pointing to those pages and help the first page rank. However, I have almost never seen a deep paginated page with links, and if you have architected pagination correctly then the equity going into pages 2-N will also flow to page 1, just like product pages linking to category pages. A sketch of the head markup is below. Hope this helps!
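To make that concrete, the head of a paginated page under this scheme might look like this (the URLs and "p" parameter are placeholders, adjust to your own site):

    <!-- Hypothetical <head> for page 7 of a 20-page category -->
    <link rel="canonical" href="https://example.com/widgets?p=7">
    <link rel="prev" href="https://example.com/widgets?p=6">
    <link rel="next" href="https://example.com/widgets?p=8">
    <meta name="robots" content="noindex,follow">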
| dohertyjf0 -
International Sitemaps
Thanks man! That's what I was thinking! I'll go with the original plan then and put everything in one sitemap index except the ccTLDs.
| blake.runyon0 -
SSL for SEO?
No difference for SEO; the main difference is the green bar which is displayed for Extended Validation (EV) SSL certificates - these are the ones which tend to be more expensive than the "standard" ones. On top of that, they don't allow wildcards, so you'll need a certificate for each subdomain (a quick way to check what a certificate covers is below). It could increase your visitors' confidence in your site, but as stated before, no direct SEO impact. Dirk
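If you want to check what a given certificate actually covers - for example, whether it includes wildcard entries for subdomains - the openssl CLI works; example.com here is just a placeholder:

    # Fetch the live certificate and print the names it covers
    openssl s_client -connect example.com:443 -servername example.com </dev/null 2>/dev/null \
      | openssl x509 -noout -text | grep -A1 'Subject Alternative Name'

A wildcard certificate will list entries like *.example.com under Subject Alternative Name; an EV certificate will not, which is why each subdomain needs its own.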
| DirkC0 -
Repeatedly target a rolling list of keywords... or is that cannibalization? The biggest confusion in SEO I've found
Ricky, It's impossible to avoid using the same and/or similar phrases in multiple pieces of content on a website. As long as you're writing the content your audience craves, this should be of little concern. To become an authority on a given topic or area, you will need to create a deep collection of similar but varied content. In a nutshell, your worries about cannibalization in this sense are unfounded. RS
| ronell-smith0 -
Crawl rate drop
Hi there. Of course it can! The crawl rate is in pages per day, so if you remove pages (especially 1.5 million of them), there won't be as much to crawl. It can also happen if your pages are static and the crawl bot has already crawled them all. Google doesn't crawl all pages all the time; they have limited resources. So, if you launched or updated the website recently and are not really updating it now, you can see the change in crawl rate. However, you can change the crawl rate for 90 days if you need Google to crawl your website constantly (usually good for websites in the process of reconstruction) - and a quick way to check Googlebot's activity yourself is sketched below. Here: https://support.google.com/webmasters/answer/48620?hl=en
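If you want to verify what Googlebot is actually doing, you can count its requests per day straight from your server access logs. A rough sketch, assuming a standard Apache/Nginx combined log format and a file named access.log (both assumptions, adjust to your setup):

    # Count requests per day from user agents claiming to be Googlebot
    grep 'Googlebot' access.log | cut -d'[' -f2 | cut -d: -f1 | sort | uniq -c

Note that the user agent can be spoofed, so for anything important, verify the hits with a reverse DNS lookup on the requesting IPs.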
| DmitriiK0 -
Multiple Landing Pages and Backlinks
Say I was going to fully build out these pages with content, how much unique content should I have per page (minimum)? Would several paragraphs do the trick?
| shauna70840 -
Migration Strategy
This temporary set-up idea makes little sense to me... migrations are tough enough anyway, and a temporary middle stage adds both risk and complexity.
| McTaggart0 -
NAP - is lack of consistency in address elements an issue?
Sounds like a good plan, Luke! Good luck with the work, and be sure the calendar is crawlable.
| MiriamEllis0 -
Two blogs on a single domain?
The URL structure is mostly subjective to me - there's no clear value difference between the two. I tend to think of services as a subset of each store location, as opposed to the locations being a subset of the service sets (see the sketch below). So, if it's easier to build the URLs the way you laid them out already, that's not an inherent problem.
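For illustration, the two structures in question would look something like this (hypothetical paths, since the original URLs weren't shared):

    /locations/denver/carpet-cleaning/    (services nested under locations)
    /services/carpet-cleaning/denver/     (locations nested under services)

Either can work; pick the one that mirrors how users browse the site and stay consistent with it.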
| KaneJamison0 -
Schema.org problems (still)
I haven't had any trouble using JSON-LD for organizations and people. For products, I have only used inline product schema so far. You could try adding the inline markup if you haven't already, and leaving the JSON-LD script up - a minimal example of the latter is below.
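For reference, a minimal JSON-LD organization block of the kind that has worked for me looks like this (the name and URLs are placeholders):

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Organization",
      "name": "Example Co",
      "url": "https://www.example.com/",
      "logo": "https://www.example.com/logo.png"
    }
    </script>

You can run both formats through Google's structured data testing tools before deciding which one to keep.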
| Everett0 -
Canonical Question: Root Domain Geo Redirects to SubFolder.
Thanks for the tips man!
| blake.runyon0 -
Shutting Down a Domain that Has Multiple Office Location Pages that will be Rebranded
a) Do not take anything Google says at face value.
b) He should look at the brands that are ranking in his own algorithm.
c) I guess he would suggest that all companies remove store locators from their websites.
| David-Mihm0 -
Tidied up site by getting rid of bad pages and now rankings tanked. - Please help
Thanks Craig. We are basically a tool hire website, but previously we also had a number of mini sites which each specialised in one aspect of what we hire - so, for instance, we had a carpet cleaner hire website and a generator hire website as well. This helped give us a larger SEO footprint, and until the start of 2014 it was proving very successful. But we got a penalty (due to some guest blog links which an SEO freelancer built for us) and this took a few months to clean up, and we then also stopped the microsites to be safe. I will check the traffic from the 301 domains and see if I can remove them. Thanks, Pete
| PeteC120 -
Partial Match or RegEx in Search Console's URL Parameters Tool?
No problem. Hope you get it sorted! -Andy
| Andy.Drinkwater0 -
Changing URLs from sentence case to lower case
Just in case you later want to turn all your URLs to lower case, you can do something like this. In your .htaccess file, insert the following (the first condition ensures the request is not an existing file on the drive, and the second avoids a redirect loop by only rewriting URLs that actually contain an uppercase letter):

    RewriteEngine On
    # Skip requests that map to a real file on disk
    RewriteCond %{REQUEST_FILENAME} !-s
    # Only rewrite URLs containing at least one uppercase letter
    RewriteCond %{REQUEST_URI} [A-Z]
    RewriteRule (.*) rewrite-strtolower.php?rewrite-strtolower-url=$1 [QSA,L]

Then in your root directory place a file called rewrite-strtolower.php and insert:

    <?php
    if (isset($_GET['rewrite-strtolower-url'])) {
        $url = $_GET['rewrite-strtolower-url'];
        unset($_GET['rewrite-strtolower-url']);
        // Rebuild the remaining query string, if any
        $params = http_build_query($_GET);
        if (strlen($params)) {
            $params = '?' . $params;
        }
        // 301 to the lowercased version of the requested URL
        header('Location: http://' . $_SERVER['HTTP_HOST'] . '/' . strtolower($url) . $params, true, 301);
    }
    exit();
| cbielich1 -
301s - Do we keep the old sitemap to assist Google with this?
Awesome, many thanks all! Much appreciated. Pete
| PeteC120