Hi Jenny,
I don't think these tags would have any effect on SEO. They are formatting tags that only affect the styling of the text they surround, and it is unlikely that crawlers treat them as anything more than that.
Hello,
Yes, it sounds like you need to redirect to the https:// version. One thing to keep in mind is to also set your preferred domain once the change is complete. Is the new site going to be https://www.domain.com or https://domain.com? Make sure the preferred domain matches this, along with any redirects that are set up. It's important to be consistent.
It doesn't sound like GWT is reporting falsely. You may want to check your trailing-slash URL rewrite. It sounds like the URLs are being rewritten incorrectly, which is causing the bad URLs to be generated and show up in GWT.
Your 301 looks OK. If the dev site was spidered and indexed, just add the site to GWT, use the URL removal tool to remove it from the index, and then remove the site and leave the redirect in place.
Appreciate the different responses. Sounds like there's not much to do but keep an eye on it and see where it goes. I may have to disavow a bunch of links, but if that is the worst of it, then it's not too bad. Thanks!
Hey Everyone,
We are getting ready to engage a client for some potential marketing/SEO work, so in preparation I ran the site through OpenSiteExplorer. The site is relatively new, and there are only two links under the inbound links section. They are relevant and add value, so no issues there.
Here is where it gets strange. When I look under the 'Just Discovered' section there are many (hundreds of) new links going back about a month. Virtually all of them have the anchor text 'Louis Vuitton outlet'. Now, the client swears he has not engaged anyone for black-hat SEO, so I'm wondering who could possibly be creating these links. They do sell some Louis Vuitton items on the site, so I'm wondering if a spam bot has picked up the site and begun to spam the web with links to it. So far today, 50 or so new links have been created with that anchor text and the client's root URL, all on very poor quality, some foreign, blog sites.
I would like to find out why this is happening and put a stop to it, for obvious reasons. Has anyone experienced something similar? Could this be a bot? Or maybe someone with an axe to grind against the client? Anyone could be doing this on their own, but it just seems strange for it to be happening to a new site that does not even rank highly at the moment. Any advice or info is greatly appreciated; thanks in advance.
Hey Jamie,
Thanks for the response. Yes I agree that the eye test is usually a good barometer for determining the quality of a site and good to hear this applies to directories as well. I'll check out the post, pretty sure I can find it, thanks for the heads up. 
Hello All,
I just started working on a new client's site that has been hit with multiple Google penalties. Looking at their backlink profile, I noticed they have numerous links from what seem to be very low-quality directory websites. My question is: when building citations and looking for directories to submit to, what makes one directory more credible than another? If most of them just publish links and business information, why does Google consider one credible and another spammy? With some it's easy to tell whether they are credible, but with others it is not as easy. Should you only really be submitting to the best of the best, or are some lower-level ones OK too? I have read a few things on this topic, but most of it is older, and I just want to hear what people have to say on this today. Thanks.
I have been doing SEO for a few years now and have to say that the Yoast SEO plugin does great things to get your site set up and well positioned for SEO, at least in terms of on-page optimization.
http://yoast.com/wordpress/seo/
It gives you a lot of flexibility for managing the SEO on different pages and posts. It sets up sitemaps and Open Graph tags, gives you recommendations on content in relation to targeted keywords, and helps you set up Google Authorship, just to name a few things. Another thing I like about it is that it is not overly bloated: it provides what you need and little that you don't. You can certainly set all this up yourself, but the amount of time you save using this plugin is huge, and I'm all for that. I would not do a WordPress build without this plugin.
Hey Remus,
While it doesn't coincide with the exact time the last algorithm update hit, it could be that the update just took time to roll out across the web. I found another post mentioning something on September 17th (http://moz.com/community/q/what-happened-on-september-17-on-google), and that is the exact day we saw the dramatic loss in organic traffic, so it seems possible something happened on that day if this client is not the only one running into issues. It would also fit with what was mentioned as having changed in the new update, in that Google is now interpreting the queries rather than just matching them.
Had something similar happen to a client of ours. On Sept 17 they lost about 85-90% of their organic traffic across all search engines. I mentioned this in a post I added yesterday.
http://moz.com/community/q/loss-of-85-90-of-organic-traffic-within-the-last-2-weeks
Still trying to figure out exactly what happened, but am also curious to see if anyone else ran into similar issues.
copyright infringement
downloads / malware
hidden text
duplicate content
noindex applied by dumb designer
bad robots.txt
Thanks for the suggestions, anything else worth checking?
So, from doing a bit more investigation, I read that Google rolled out a new algorithm update a couple of weeks ago (around the same time the traffic dropped): Google Hummingbird. My initial impression is that Google is now trying to interpret your search phrases rather than just reading them and returning matching results. This client's site is built around some wording that could be interpreted as slang. So it looks like instead of returning the client's site when these keywords are typed in, Google is now returning slang-term results, since my guess is that more often than not people were looking for the slang term, not the client's site, resulting in the massive traffic drop. Has anyone seen anything similar?
Hey Everybody,
Have a client that recently came to us asking for SEO help. We did some initial analysis of their current SEO status, and most everything looked pretty good. On-page work was solid, nothing really lacking there other than missing alt tags on all images. Their linking profile looked good too: lots of good links from quality sources, all relevant. The client has done some good press releases. They could probably use a bit more focus in their content, as it is somewhat general and not keyword focused. Initially it didn't look like they needed any help with their SEO, so I was a bit curious as to why they contacted us.
Today we got their Google Analytics information and immediately noticed that they have had an 85-90 percent drop in organic traffic from all major search engines, starting about two weeks ago. If all their SEO looks to be done properly, any ideas what would account for the massive drop in traffic? The only thing that looks like it may have happened is that they dropped a couple of spots, from position #1 to position 2-3, for some of their highest-traffic terms. Even if that is the case, I would not expect such a steep drop-off in organic traffic.
Just curious what anyone else might attribute the huge drop in traffic to, or what else may help identify the issue. It's almost as if Analytics was turned off or removed from the site, but that is not the case.
Hello,
The easiest way, I think, is to create a .htaccess file in the root of domain 1 and add this text:
RewriteEngine On
RewriteBase /
# Redirect comes from mod_alias, so the two mod_rewrite lines above
# are only needed if you also use RewriteRule-based rules
Redirect 301 /pages/some-page-here http://domain2.com
This will redirect a user visiting http://www.domain1.com/pages/some-page-here to http://domain2.com.
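If mod_alias happens to be unavailable on the server, a mod_rewrite version of the same redirect could look like the sketch below (using the same example paths from above; this assumes mod_rewrite is enabled):

RewriteEngine On
# In per-directory (.htaccess) context mod_rewrite matches the path
# without the leading slash, hence ^pages rather than ^/pages.
# [R=301,L] issues a permanent redirect and stops further rule processing.
RewriteRule ^pages/some-page-here$ http://domain2.com/ [R=301,L]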
Hello,
Yes, if your pages are indexed and you want to change the URLs, then a 301 redirect is your best bet. Edit the URL on the page or post you want to change, then add the redirect to the .htaccess file. You may need to create one in the root directory of your WordPress install. The general .htaccess file looks like this:
# Standard WordPress rewrite block
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
Just add the redirect to the end of this file. It could look something like:
Redirect 301 /old-page http://www.something.com/new-page
Then you should be all set.
Hello,
What I have done in the past is, after some fixes have been put in place, delete those error messages (in, I assume, your Webmaster Tools account) and then see if they return. If they don't, it looks like your issue has been solved. If they do, you may need to reevaluate and see whether another fix would correct the error.
Hello,
I don't think you need to create the same site four times. A better approach could be to make one site and then create separate landing pages with slightly different targets. For example, you could target 'queens web design' on one page and a variation of the keyword, like 'Long Island Web Development', on another. You also don't need to link these pages from the main navigation, but make sure they exist as pages on the site. This way the pages will still rank but won't be as confusing to potential customers: since they enter through the targeted landing page and then continue to browse the rest of the site, they won't directly see the different location pages. It would probably also be beneficial to see which location has the most/best search volume and then target that location a bit more heavily in the rest of the website copy.
I've been trying to block out a few URLs using robots.txt, but I can't seem to get the specific one I'm trying to block. Here is an example.
I'm trying to block
but not block
If I set up my robots.txt like so:
Disallow: /cats
It blocks both URLs. When I crawl the site with Screaming Frog, that Disallow causes both URLs to be blocked. How can I set up my robots.txt to specifically block /cats? I thought the way I was doing it would work, but it doesn't seem to.
Any help is much appreciated, thanks in advance.
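One approach that may solve this, assuming the crawlers involved honor the $ end-of-URL operator (Googlebot and Bingbot do, though it is not part of the original robots.txt standard):

User-agent: *
# $ anchors the match to the end of the URL, so only /cats itself is blocked
Disallow: /cats$

A bare "Disallow: /cats" is a prefix match, which is why it blocks every URL that merely starts with /cats.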
Appreciate the response. Will test this on a few pages and see if it leads to better results. Thanks.
Hi,
We have a client who engaged us recently for some SEO work, and most of their website already looks pretty good SEO-wise. Many of their site pages rank at the top or middle of page two for their targeted keywords. In many cases they are not using the targeted keyword in the URL, and most pages could use some additional on-page cleanup. My question is: is it worth rewriting the URLs to include the targeted keyword and then doing 301 redirects from the old pages to the new ones in order to improve ranking? Or should we just do the minor on-page work in hopes that it will be enough to improve the rankings and push them onto the first page? Thanks.