Questions
Where does rel=canonical go? One file that manages sort order, view, filters, etc...
If we're using dynamic code so that the search results page is a single template for ALL cities, then I assume we just need to add logic at the top of that template to build the rel=canonical tag dynamically from the city path in the URL, right? Then when this one page loads for any city and any URL parameters, the tag will be custom per city, for example:

URL: domain.com/tx/austin?urlparameters..... LINK: domain.com/tx/austin
URL: domain.com/ca/los-angeles?urlparameters..... LINK: domain.com/ca/los-angeles

Is that correct? This would just be dynamic logic in our codebase on the one template?
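A minimal sketch of that logic, assuming a Node/TypeScript stack (the function name and domain are placeholders): keep only the origin and path of the requested URL and drop the sort/view/filter parameters, so the one template emits a self-referencing canonical for every city.

```typescript
// Minimal sketch: derive a self-referencing canonical URL by stripping
// query parameters from the requested URL. Names are illustrative.
function canonicalFor(requestUrl: string): string {
  const url = new URL(requestUrl);
  // Keep protocol, host, and path; drop sort/view/filter parameters.
  return `${url.origin}${url.pathname}`;
}

console.log(canonicalFor("https://domain.com/tx/austin?sort=price&view=list"));
// -> https://domain.com/tx/austin
console.log(canonicalFor("https://domain.com/ca/los-angeles?filter=open"));
// -> https://domain.com/ca/los-angeles
```

The template would then render `<link rel="canonical" href="...">` with that value in the page head.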
Technical SEO Issues | ErnieB
Keyword Phrase in URL structure
Hi Rebecca - Thanks for the fast reply! In my example, how would you structure the "find a business" URLs vs. the specific business location page URLs?

website.com/anxiety-treatment/co/denver to browse the directory, so users can work backwards to find a location in another city
website.com/johnson-anxiety-treatment-center-denver-co as the link to the one specific office in Denver named "Johnson Anxiety Treatment Center"

Do you feel that the specific office page needs to share the same URL structure as the browse directory? If so, it would get super long, like this: website.com/anxiety-treatment/co/denver/johnson1001 Appreciate your thoughts & reply.
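For illustration only (the patterns below are assumptions based on the examples above, not from the actual site), the two schemes can be contrasted as simple route patterns:

```typescript
// Illustrative route patterns for the two page types in question.
const directoryPattern = /^\/[a-z-]+\/[a-z]{2}\/[a-z-]+$/; // e.g. /anxiety-treatment/co/denver
const flatLocationPattern = /^\/[a-z0-9-]+$/;              // e.g. /johnson-anxiety-treatment-center-denver-co

function classify(path: string): string {
  if (directoryPattern.test(path)) return "directory browse page";
  if (flatLocationPattern.test(path)) return "specific location page";
  return "unknown"; // the nested alternative would add a fourth segment
}

console.log(classify("/anxiety-treatment/co/denver"));                // directory browse page
console.log(classify("/johnson-anxiety-treatment-center-denver-co")); // specific location page
```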
Technical SEO Issues | ErnieB
URL Path to Store Article Library for SEO
I agree to avoid a URL such as domain.com/kidney-dialysis-ckd-dialysis/article. Your article URL will probably include a similar keyword, and the path starts to get too long and keyword-stuffed. I also agree that you want to set up an internal structure as described: Category > Article Name.

Where I differ is in including the category in the URL; I believe it is not needed. Instead, let the URL be the article name alone. Then structure your website so that you have a strong category page for your main keyword phrase, with links between it and these articles (and vice versa) as appropriate. Your internal link structure will tell Google just how important the main category page is for the main term, and your supporting articles will be organized into categories through your UI and navigation structure.

Setting it up this way will inform Google how each piece of content is related while still letting the article's own terms lead the URL. However, this is just a preference; you can include the category in the URL structure, and it may even be a big benefit to your site. I prefer the more direct URLs, enforcing the structure through UI and internal link design; I think it allows for more flexibility and keeps the focus on the article's terms.
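As a hedged sketch of the flat-URL approach described above (all names are made up for illustration): the article slug alone forms the path, while the category only shapes internal links, with the nested alternative shown for contrast.

```typescript
// Illustrative only: flat article URLs vs. category-nested URLs.
interface Article {
  slug: string;     // e.g. "choosing-a-dialysis-schedule"
  category: string; // e.g. "kidney-dialysis"; used for linking, not the URL
}

const articleUrl = (a: Article) => `/${a.slug}`;              // flat: the preference above
const categoryUrl = (a: Article) => `/${a.category}`;         // strong category hub page
const nestedUrl = (a: Article) => `/${a.category}/${a.slug}`; // the alternative also discussed

const post: Article = { slug: "choosing-a-dialysis-schedule", category: "kidney-dialysis" };
console.log(articleUrl(post));  // /choosing-a-dialysis-schedule
console.log(categoryUrl(post)); // /kidney-dialysis
console.log(nestedUrl(post));   // /kidney-dialysis/choosing-a-dialysis-schedule
```

Either scheme works; the flat version relies on the category page linking to its articles (and vice versa) to signal the relationship to Google.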
Technical SEO Issues | Ray-pp
NoIndex Purchase Page
In most carts these aren't an actual add-to-cart page per se; they are links to the product that use Ajax or a query string to add the product to the cart. So basically, if people enter through that link, the product is automatically added to their cart. I think most people see that as bad e-commerce practice, plus it would really make it hard to do any analysis of cart abandonment.
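As a hypothetical illustration of that pattern (the add-to-cart parameter name varies by cart platform; the one here is an assumption):

```typescript
// Hypothetical: build a product link that auto-adds the item to the
// cart via a query string. Domain and parameter name are placeholders.
function productLink(productId: string, autoAddToCart = false): string {
  const url = new URL(`https://example.com/products/${productId}`);
  if (autoAddToCart) url.searchParams.set("add-to-cart", productId);
  return url.toString();
}

console.log(productLink("1234"));       // https://example.com/products/1234
console.log(productLink("1234", true)); // https://example.com/products/1234?add-to-cart=1234
```

Anyone landing on the second URL gets the product put in their cart immediately, which is why indexing or sharing such links is usually discouraged.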
Intermediate & Advanced SEO | LesleyPaone
How to Best Target Two Keyword Phrases for a Business Directory
Yes, I think it does. We were hoping that we could just modify all the title tags to "Dialysis Center & CKD Center Los Angeles" to include the new keyword, CKD Center. Or maybe just use "CKD Dialysis Center Los Angeles" and hope that Google will rank us for both "CKD Center Los Angeles" and "Dialysis Center Los Angeles".
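If these pages come off one dynamic template, both proposals reduce to a per-city title template; a sketch (the city value and function names are illustrative):

```typescript
// Illustrative only: the two title-tag variants proposed above,
// templated per city on the shared directory page.
const combinedTitle = (city: string) => `Dialysis Center & CKD Center ${city}`;
const mergedTitle = (city: string) => `CKD Dialysis Center ${city}`;

console.log(combinedTitle("Los Angeles")); // Dialysis Center & CKD Center Los Angeles
console.log(mergedTitle("Los Angeles"));   // CKD Dialysis Center Los Angeles
```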
Intermediate & Advanced SEO | ErnieB
Subdomain Removal in Robots.txt with Conditional Logic??
Here's how I dealt with a similar situation in the past:

- Put a robots.txt on each of the dev subdomains and on the live domain. The dev subdomains' robots.txt excluded the entire subdomain, and the subdomains were verified in Google Webmaster Tools and removed as needed.
- Made the live domain's robots.txt read-only so it didn't get overwritten. I should have made the dev subdomains' robots.txt read-only as well, since they sometimes got refreshed with the live content (there was a UGC database that would occasionally get copied to a dev subdomain, and the live robots.txt would get copied over too, leaving the dev subdomain indexed).
- Set up a code monitor that checks the contents of all of the robots.txt files daily and sends me an email if anything is changed (see the sketch below).

Not perfect, but I was at least able to catch changes soon after they happened, and it prevented a few bad changes.
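A minimal sketch of that daily monitor (hostnames are placeholders, and the alert is a console message standing in for the email):

```typescript
// Minimal sketch: fetch each robots.txt, hash the contents, and flag
// any change since the last check. Requires Node 18+ for global fetch.
import { createHash } from "node:crypto";

const targets = [
  "https://www.example.com/robots.txt",
  "https://dev.example.com/robots.txt",
];

const lastSeen = new Map<string, string>(); // url -> content hash

async function checkRobots(): Promise<void> {
  for (const url of targets) {
    const body = await (await fetch(url)).text();
    const hash = createHash("sha256").update(body).digest("hex");
    const previous = lastSeen.get(url);
    if (previous !== undefined && previous !== hash) {
      console.warn(`robots.txt changed: ${url}`); // swap in an email alert here
    }
    lastSeen.set(url, hash);
  }
}

checkRobots();
```

In practice the hashes would need to persist between runs (a file or small database) rather than an in-memory map, since the check only runs once a day.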
Technical SEO Issues | KeriMorgret