Posts made by JaredMumford
-
RE: Is there a way to track mobile rankings vs desktop rankings in Moz?
James, can you elaborate a bit on your answer?
-
RE: In need of guidance on keyword targeting
It looks like these guys do a lot more than just concrete repair. For that reason alone, I wouldn't try to fully optimize the home page for concrete-related keywords only.
Since the site itself has very low metrics to begin with, you basically have a clean slate. I'd optimize each of the service landing pages for its respective keywords (i.e. concrete repair, concrete repairs, concrete repair contractors for the concrete page, though I wouldn't use any one keyword three times), making sure they have good titles, a good amount of unique copy, and so on.
It's a natural approach to the taxonomy, which always works well for Google placement. It won't get you to rank on on-page factors alone (except maybe for some of the more obscure services), so some inbound links will probably be needed.
Good Luck!
-
RE: Salvaging links from WMT “Crawl Errors” list?
Hi Gregory -
Yes, as Frederico mentions, you do not have to put the RewriteCond before every rewrite; since the .htaccess is at your root, it's implied. You might need to do this if you're creating multiple redirects (www to non-www, etc.).
Also, Frederico is right that this isn't the best way to deal with these links, but I use a different solution. First I get a flat file of my inbound links using other tools as well as WMT, and then I run them through a test to ensure that each linking page still exists.
Then I go through the list and remove the scraper/stats sites like webstatsdomain, Alexa, etc. so that the list is more manageable. Then I decide which links are OK to keep (there's no real quick way to decide, and everyone has their own method), but the only "bad" links would be ones that may violate Google's Webmaster Guidelines.
Your list should be quite small at this point, unless you had a bunch of links to a page that you subsequently moved or whose URL you changed. In that case, add the rewrite to .htaccess. For the remaining list, you can simply contact the sites, notify them of the broken link, and ask to have it fixed. This is the best-case scenario (instead of having the link go to a 404 or even a 301 redirect). If it's a good link, it's worth the effort.
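For anyone looking for the actual syntax, here's a minimal sketch of a www-to-non-www redirect with its RewriteCond (example.com is a placeholder; swap in your own domain):

```apache
RewriteEngine On
# Redirect www.example.com to example.com, preserving the request path
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]
```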
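To give a concrete idea of the "remove the scraper/stats sites" step, here's a rough sketch in Python. The SCRAPER_DOMAINS set and the URLs below are made-up examples for illustration, not a definitive blocklist:

```python
from urllib.parse import urlparse

# Example scraper/stats domains to drop from the backlink list (not exhaustive)
SCRAPER_DOMAINS = {"webstatsdomain.com", "alexa.com", "similarsites.com"}

def filter_scrapers(backlinks):
    """Return only the backlinks whose host is not a known scraper/stats site."""
    kept = []
    for url in backlinks:
        host = urlparse(url).netloc.lower()
        # Strip a leading "www." so www.alexa.com matches alexa.com
        if host.startswith("www."):
            host = host[4:]
        if host not in SCRAPER_DOMAINS:
            kept.append(url)
    return kept

links = [
    "http://www.webstatsdomain.com/d/example.com",
    "http://blog.partner-site.com/our-review",
]
print(filter_scrapers(links))  # only the partner-site link survives
```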
Hope that helps!
-
RE: Disavow Tool - WWW or Not?
To clear up any uncertainty, I think there are two questions being asked:
- Link to be disavowed: Do I disavow both the www and non-www versions of a bad link?
- Site you own: Which site in webmaster tools do I upload the disavow list to - www or non-www?
The link to be disavowed is an easy answer, because in most cases if you want a link disavowed, you probably don't want any link from that domain (because it's suspect, de-indexed, etc.). Therefore you can simply blanket it with domain:badwebsite.com. This will be sure to catch any link from that site to yours, regardless of the subdomain (i.e. www.badwebsite.com, ww2.badwebsite.com, forum.badwebsite.com, etc.).
Answer #2 isn't quite as easy. The safest (and arguably proper) way is to link-mine both the www and non-www versions of your website and treat each as a separate site (as Google does). Even if you are using 301 redirects or canonicals, I still recommend this method. In many cases, one version will have a much smaller backlink volume. In any case, pick out the bad links and try to get them removed by emailing the websites. Once the attempt has been made, compile the remaining backlinks (still in separate lists for www and non-www) and upload them to their respective disavow tool areas.
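For reference, the disavow file you upload for, say, the www property might look something like this (the domain names are placeholders; lines starting with # are comments that Google ignores):

```
# Disavow list for the www property - removal requests sent, no response
domain:badwebsite.com
domain:spammy-directory.net
http://forum.othersite.com/thread-123
```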
-
RE: Google Adwords - trying to understand the figures...
Remember, Google uses LSI in its algorithms, so usually when you see strange discrepancies like this it means the terms are being treated as semantically related. E.g., if you search for "ready mix concrete" you'll see both terms (mix/mixed) in bold. Same for forklift truck/hire: you'll see both in bold.
I can't say I know for sure that this is the reason; it's just food for thought. I no longer use the Google Keyword Tool to estimate absolute traffic, as it can be really off, but what it still does well is measure relative traffic (keyword x has 2.5x the traffic of keyword y, and so on).
-
RE: Caching Problem !
Hi Shubham - I see this domain cached on Dec 29:
http://webcache.googleusercontent.com/search?q=cache%3Awww.glanceseo.com/
Was it a specific page you were inquiring about?
-
RE: Removing URL Parentheses in HTACCESS
I thought I'd come back and re-post the solution in case this shows up in SERPs or any other Moz members are looking for this answer (courtesy of Noah Wooden of Izoox). HTACCESS:
<IfModule mod_rewrite.c>
RewriteEngine On
# Strip sets of opening/closing parentheses from the URL and 301 redirect
RewriteCond %{REQUEST_URI} [()]+
RewriteRule ^(.*)[(]+([^)]*)[)]+(.*)$ /$1$2$3 [R=301,L]
</IfModule>
Remember to put this in the proper 'order' in your .htaccess file if you are doing any other redirecting. The code above 301 redirects URLs with parentheses to the exact same URL minus the parentheses.
-
RE: Removing URL Parentheses in HTACCESS
Thanks, Merlin - I'll have their programmer try this.
-
RE: Removing URL Parentheses in HTACCESS
Hi Merlin - thank you.
Here is an example:
www.domain-name.com/category1/subcategory/product-name-details-(model-number)-length
needs to change to:
www.domain-name.com/category1/subcategory/product-name-details-model-number-length
Any suggestions would be great. Their programmer is having trouble creating a rule.
Thanks in advance
-
Removing URL Parentheses in HTACCESS
I'm reworking a website for a client, and their current URLs contain parentheses. I'd like to get rid of these, but individual 301 redirects in .htaccess are not practical, since the parentheses appear in many URLs.
Does anyone know an HTACCESS rule that will simply remove URL parentheses as a 301 redirect?
-
RE: Choosing an SEO Company
This answer should be a featured blog post.
-
RE: Canonical Related question
Hi Manish - this is exactly how you use the canonical tag.
Your other option would be to use rel="next"/rel="prev", but canonical works just as well and is what I use unless the various "pages" (page 2, 3, 4, etc.) are actually ranking on their own.
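For anyone unsure of the markup, the canonical tag in this scenario sits in the head of each paginated page and points back to the main page (the domain and path below are placeholders):

```html
<link rel="canonical" href="http://www.example.com/category/" />
```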
Cheers!
-
RE: Choosing an SEO Company
I agree with all of the above. The reality is that SEOs now get the 'mechanic' rap: you don't know what they're doing under the hood, and they charge an arm and a leg.
I think first and foremost you have to look at the client portfolio and do reference checks. Unfortunately, companies who provide these are never the cheapest (for good reason), but you get peace of mind.
Some SEO companies claim that they can't provide references out of respect for their clients' privacy. This is true in some cases, but any SEO company that has been given permission to display a client's logo should have permission to give a reference for at least one of those clients, if not 50% of them.
Speak to the reference and check the rankings they received. It's a surefire way to know if the company you are considering is worth its salt.
A good question, and one that many businesses don't ask - and they pay the price for it. I've heard it a thousand times, and almost every client we get has trust issues because they've already been burned.
Good luck!
-
RE: Local SEO-How to handle multiple business at same address
FYI: the unique suite # was requested simply so that the address was 'unique' from Google's perspective (assuming that Google understands suite #s). The mail was always forwarded to the proper business even with the duplicate suite #, since this was part of the virtual office service.
-
RE: Landing Page URL Structure
From a strict A/B standpoint where the two variables are:
www.domain.com/state/california
You will see no discernible difference in SEO results. That being said, if you plan to expand the URLs later, i.e.
www.domain.com/state/california/santa-monica/our-service-keywords.*
then you should probably consider how long they will become and factor that into your decision.
-
RE: Local SEO-How to handle multiple business at same address
Thumbs up to Miriam, who gives a lot of good advice here. And merging is definitely the worst-case scenario. Since this discussion is still going, and since it seems your client can't simply go get a real business address (and the overhead that comes with it), I'll give a real case study as an example.
We had a client in a competitive niche who provided a service (not a product). She had a virtual address, and by that I mean they paid a monthly fee to the location; as part of that fee the location would forward any inbound mail, and the client could also use meeting rooms, offices, or boardrooms a certain number of times per month. Other services were available, such as phone answering, etc.
When she became a client, the first thing we realized was that she had the same suite number as all the other businesses using the same 'virtual office' service. The client's previous SEO had already started citation work, but didn't warn them about merging or any of the other associated problems.
So what we did first was request a different, unique suite number from the service, which they provided at no extra cost. Then we bought a local number and forwarded it to her home - a local transfer, since she was indeed in that city but worked from home unless she needed to meet clients, etc.
So now we had a unique local address with a unique local phone number. The last thing we had to do was simply mine for old citations and have them all changed.
This worked, and still does, but we only did it after explaining to the client that it was not the best scenario for sustainability in local SEO. As per my first comment, at any time Google could simply omit that address and all businesses that claim it as a brick-and-mortar address.
Best of luck!
-
RE: Local SEO-How to handle multiple business at same address
I agree with bjgomer13 in saying that ranking locally with shared addresses does still work. The only caveat is that Local is always changing, and it's hard to know if this is something Google will target - not because of clients like yours, but because of businesses that abuse it, just as they did with PO boxes earlier. As always, the advice you get is confusing because no one can 'predict' what's going to change, and this was a pretty shaky year for algo changes, so everyone is being careful.
This response probably just confuses things more!
-
RE: Local SEO-How to handle multiple business at same address
This is a tough one. The worst thing to do is to start with citations, find out they aren't working, and then fix the issue and start all over again (now having two NAPs floating around).
Some businesses use virtual offices in an attempt to rank in different cities or areas. If this is the case with your client, it never hurts to contact that service and explain that you must have a unique suite number. I've found that in some cases they will be quite accommodating.
As for the effects: I've performed local optimization for clients in this scenario and it still worked fine (and the other businesses using the same virtual office were also in the maps for their keywords), but with constant changes in Local, it's risky (in my opinion) to continue without getting a unique address first.
Just my 2 cents!