Category: Technical SEO Issues
Discuss site health, structure, and other technical SEO issues.
-
Can these Yoast SEO integration issues be solved?
We are just starting several new sites, and we chose Yoast for SEO. It has taken our sites, up for 8 months, from almost no traffic to 5-10 times the traffic in about a month. If that continues it will be grand. The version we are using is new and feature-rich for a non-expert. For instance, Google+ authorship is easy to implement, putting my pic up on Google SERPs. It provides an effective and consistent SEO interface for all pages, posts, categories, and tags (if you are using them), with a red/yellow/green SEO status light that can be sorted, plus easy-to-use SEO page tools integrated. I like it.

Now, if you have lots of custom programming, as it sounds you do, you may want to gradually integrate the product or just use the basics. Yoast's model is simple to operate, but his approach is fairly comprehensive for such a complex "black art" as SEO. I think Google will respect his work in the end as an honorable approach to SEO for the average WP site. No doubt it is not a panacea, nor right for everyone, but his formula seems sound and easy to implement, and I can say that so far it is working for me:

www.apalytics.com - corp site
www.cio-tech.com - executive blog
www.synsynack.com - network analysis blog

I feel it is well worth the effort to invest in the Yoast model.

Bill Alderson
| Packetman0070 -
WordPress question
Correct - often it does show as 100 once you have some links, etc. But keep in mind: WordPress is a strong domain, but your subdomain gets treated like its own domain, so you don't magically get a DA of 100 and rank for competitive terms. Your PA and backlink profile need to be built up as well.
| DavidKonigsberg0 -
Backlinks go to "example.com" our homepage is "example.com/default.html" am I losing internal link power?
Depending on how example.com is redirected to example.com/default.html, you may be losing some "power" through the redirect. An easy fix would be to simply make your example.com/default.html act as though it's example.com through .htaccess.
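A minimal .htaccess sketch of that fix, assuming an Apache server (the directives are standard Apache; the exact rules you need depend on how the current redirect is set up):

```apache
# Serve default.html as the homepage without exposing it in the URL
DirectoryIndex default.html

RewriteEngine On
# 301 any direct request for /default.html back to the root,
# so backlinks and internal links all consolidate on example.com/
# (matching THE_REQUEST avoids a loop with DirectoryIndex)
RewriteCond %{THE_REQUEST} \s/default\.html[\s?] [NC]
RewriteRule ^default\.html$ / [R=301,L]
```

With this in place, example.com serves the homepage directly and any legacy /default.html links pass their equity to the root via the 301.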
| CaseyKluver0 -
Long Domain Name - Subpage URL Question
I have a long domain in the deregulated energy market, but it's an older domain and I have it performing well. But as with anything, everything can be better. So if I'm treading down a dark path here, let me know. I always operate under the premise that everything can be better and nothing is non-negotiable. Clearly I'm posting here for a reason.
| tgr0ss0 -
Canonical Link Question
Makes sense to me. You just need to make sure that you can handle talking about all these things and keep up with the publishing schedule. Stay focused on your most important categories.
| CleverPhD0 -
Site Got Hacked! Need Help!
That sucks - hackers are criminals. If it's the entire site, you could request root domain removal in WMT and they will deindex the entire site. Then it's simple to get the site indexed again: submit the URL to Google, link from prominent places, resubmit a new sitemap to Google, etc. This ensures the old adult junk will go away instantly and the new stuff will appear.

What are your rankings like? If they are good and you are afraid of losing them, then just be a little more patient: don't request the site to be deindexed, and work on getting the crawlers coming back.

Check your robots.txt to make sure they didn't hijack that file! They could be blocking Google from crawling in robots.txt.
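As a quick sanity check on that last point, a hijacked robots.txt often contains a blanket block like this, which tells every crawler to stay out of the entire site:

```
User-agent: *
Disallow: /
```

If you find a wildcard `Disallow: /` you didn't put there, restore your original robots.txt before doing anything else, or Google can't recrawl the cleaned-up pages.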
| irvingw0 -
Ways of Helping Reducing Duplicate Content.
You can also set up Google Webmaster Tools, as it gives you a report showing duplicate titles and meta tags. We use both GWT and SEOMoz in addition to other spider software.
| CleverPhD0 -
Dynamic page
I think this is an answer that goes beyond Google. We use rewrites extensively and do not have any problems, though there are some caveats.

Regarding GoogleBot missing information, you just need to make sure that the new URL has all the info. Let's say you are a plumbing portal and rewrite /locator/find?radius=60&zip=&state=FL to /plumbers/florida-fl/miami/33110/. Your search radius can be a default value instead of having to put it in as a parameter. It helps with site structure to think of things as how they would look as a static directory. In this case, you are actually giving more information to GoogleBot with the rewritten URL than with the old one, as you have included who you are searching for (a plumber), the city (Miami), the state (FL), and the zip code (33110). The previous URL only indicated the state. If you don't like using all the folders, you can simply use a longer file name with dashes between the words.

If you use rewrites, make sure Google is not spidering the original URLs, otherwise you get hit with duplicate-content errors. Monitoring Webmaster Tools or using spider software will help you find the holes. You can then use canonical links and noindex tags to get the old URLs out of the index and make sure Google has the correct pages. This all depends on how you implement your rewrites.

If you take some time to look at how you want to organize your site to start with, then the first two items will usually take care of themselves. A good exercise is to write down how all of this would work within a breadcrumb navigation. This forces you to get organized and also helps you set up how you want all your pages to be shown to Google. If you do start to add parameters on top of this basic structure, like pagination or other sortable options, you need to think about how you would noindex,follow those pages to make sure that your main page ranks for a given key phrase rather than all the other sorted versions of the same page.
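A minimal mod_rewrite sketch of that plumber example (the parameter names `radius`, `state`, `city`, and `zip` are assumptions about the underlying search script; adjust them to whatever /locator/find actually accepts):

```apache
RewriteEngine On
# /plumbers/florida-fl/miami/33110/ -> internal dynamic search URL
# $1 = state slug, $2 = state code, $3 = city, $4 = zip;
# radius falls back to a default of 60 instead of being a URL parameter
RewriteRule ^plumbers/([a-z-]+)-([a-z]{2})/([a-z-]+)/([0-9]{5})/?$ /locator/find?radius=60&state=$2&city=$3&zip=$4 [L,QSA]
```

Because this is a rewrite (no `R` flag), the visitor and GoogleBot only ever see the friendly URL; the dynamic script is invoked internally.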
One thing that is overlooked in setting up this kind of structure is that you can use it to your advantage in your analytics tools to look at global trends on your site. This could be done on any site. Using the example above, all US states are at the 2nd directory level, cities at the 3rd, and zips at the 4th, which makes it really easy to use a regexp on URLs to group them. For example, you could set up a filter in your analytics to combine all sessions that looked at pages in Florida and see what the next action was. Cheers!
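Continuing the hypothetical URL scheme above, that Florida filter could be a URL regexp as simple as:

```
^/plumbers/florida-fl/.*
```

It matches every Florida page regardless of city or zip precisely because the state always sits at the same directory depth - which is the payoff of planning the structure up front.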
| CleverPhD0 -
Right redirect to transfer juice www, no-www and website movement
Yes, if you just want to redirect individual pages like that, you could have any requests on the old domain (www and non-www) redirect to the new domain, with specific pages pointing to specific pages - see below:

RewriteEngine on
RewriteCond %{HTTP_HOST} ^olddomain\.co\.uk$ [OR]
RewriteCond %{HTTP_HOST} ^www\.olddomain\.co\.uk$
RewriteRule ^/?$ http://www.newdomain.co.uk [R=301,L]
RewriteRule ^page2\.html$ http://www.newdomain.com/page2.html [R=301,L]
| Matt-Williamson0 -
E-commerce site multi language product urls
Hi Asaf, If the core content is going to be unique (i.e. the product descriptions), then you wouldn't need to worry about duplication issues, and I would definitely recommend unique English URLs to match the content. From both a customer and a search engine point of view, it makes sense to have the URLs in English, and pointing to a unique destination that can be indexed. If there are large amounts of duplication, you can always use the canonical tag, although it doesn't sound like it would be necessary in this scenario. Cheers David
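For reference, if duplication did crop up, the canonical tag is a single line in the `<head>` of each duplicate page pointing at the preferred URL (domain and path here are placeholders):

```html
<link rel="canonical" href="http://www.example.com/en/product-name/" />
```

Search engines then treat the duplicates as alternates of the canonical URL and consolidate ranking signals onto it.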
| mrdavidingram0 -
Question about Robots.txt
Thank you for your reply. This surely helps. I will probably edit the .htaccess.
| paumer800 -
The impact of multisite WordPress on SEO
As long as they are, in fact, two different websites, with different content on each site, there's no concern for any duplicate-content penalties. -Dan
| evolvingSEO0 -
Client's site dropped completely for all keywords, but not brand name - not manual penalty... help!
(rolling eyes) I need a big "I heart Google" shirt...

So we filed a reconsideration request the week that this happened, then got a reply that it wasn't a manual penalty and that we should look at the content. So we filed another one, with a very in-depth explanation that we hadn't changed much content, we weren't violating any Google guidelines, and we hadn't done anything to trigger an algorithmic penalty.

Since this started, we didn't do anything on his site - zero changes to content (we wanted to wait until we heard back from the 2nd request). We added a single link from a car dealer directory site (one that you have to call to be listed in) the day we realized he had dropped from the SERPs, but nothing since then.

The ONLY thing we did was finally sort out his +Local listing. The address was displaying as 916 but is really 915, but since he didn't know who registered his Places page, we had to re-verify. So there were 2 verified Google accounts with access to the Places dashboard, which apparently causes issues and problems. We got that sorted out so that we're the only account with access, and we updated the address.

Then we got a reply Monday from our 2nd reconsideration request that was word-for-word the same as the first one - no manual penalty, check your content, blah blah blah. And as of Tuesday of this week, BAM - he's back in the SERPs. He's around page 3 or 4 for most of his targeted terms, so the little bit of initial optimization we did has him ranking higher than he was before he signed up with us.

So - I don't know if the Places/+Local address being off by a few numbers was the problem (probably not, because that problem was there before we started), or if the second reconsideration request got someone to look and fix something... or if it was just a really random algorithmic hiccup...
| Greg_Gifford0