Best posts made by StreamlineMetrics
-
RE: What are the SEO consequences of using a 200 instead of a 301 redirect?
If both cases display the exact same content via different URLs, then you don't want to return 200 status codes for both versions. A 301 redirect might work, but since both URLs serve the exact same page, you might run into an infinite redirect loop and make your pages completely inaccessible. As a result, I would recommend using the rel="canonical" tag to designate which URL is the correct one for the search engines to index.
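For reference, a minimal sketch of the tag, which would go in the head of both versions of the page (the URL here is a hypothetical placeholder) -
<!-- hypothetical URL; point both versions at the one you want indexed -->
<link rel="canonical" href="http://www.example.com/preferred-page/" />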
-
RE: Blog on 2 domains (.org/.com), Canonical to Solve?
Yes, the canonical tag is the best route to take in this scenario.
-
RE: Using the same Google Analytics account for two sites
I'm not sure if I understand this correctly but I'll give it a shot. You say the competition is using the same GA account on two sites, but how does your friend know what the competition's traffic numbers are? Does he have access to their Google Analytics account? And where does the competition "show its high traffic"?
Google Analytics simply tracks how many visits a website receives; it doesn't do anything to generate or impact actual traffic. One way to tell whether the competition is fudging its traffic numbers by claiming all of the GA visits are for one site when the traffic is really split across two websites is to ask them for a GA report showing that all of the visitors share the same Hostname.
You can see the Hostname by going into Google Analytics and clicking on Custom Reporting. Choose "Unique Visitors" as the metric and "Hostname" as the dimension.
-
RE: How to improve data analysis that come from the Analytics ?
Hi there,
We had a client with the same issue. They could use Google Analytics to track which traffic sources generated leads, but there was no way to track which lead came from which traffic source.
So we created a tool called Convertable (http://convertable.com) that does just this. It emails you the lead information with full analytics data attached. Then you can update the status of each lead in the online dashboard, entering how much revenue, how many shirts, etc. each individual lead generated. This enables you to tie the revenue generated back to the original traffic source in order to determine the true ROI of your online marketing campaigns. Convertable is still in free beta mode, but we will soon be launching premium plans with more features. Feel free to check it out and let me know if you have any questions or suggestions for how to improve it!
- Patrick (co-founder of Convertable)
-
RE: Repeat name and location in URL or no ?
I would recommend not repeating the name and location in the URL and instead use something like www.stjeromemitsubishi.ca/partsandservice/contact.aspx. There won't be any SEO benefit by simply repeating the name and location in the contact page URL.
If you had multiple locations, then I could see the point in doing something like www.stjeromemitsubishi.ca/partsandservice/seattle-dealership-contact.aspx
(ideally I would suggest structuring the URLs like this: www.stjeromemitsubishi.ca/locations/seattle/contact.aspx, but I am following your original example)
-
RE: URL structure: 301 redirect or leave as is?
I would recommend optimizing the URL structure and 301ing the old URLs to the new ones. The amount of link equity lost (which is quite minimal) by 301ing the ugly URLs to the new URLs will be outweighed by the benefits of using a better URL structure in the long term.
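Assuming the site runs on Apache, a minimal sketch of one of these 301s in .htaccess might look like the following (both URLs are hypothetical placeholders) -
# Hypothetical example: map one old, ugly URL to its new, clean equivalent
Redirect 301 /old-ugly-directory/page1234.html http://www.example.com/widgets/blue-widgets/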
-
RE: Are you an in-house SEO or an Agency/freelancer SEO ?
Agency for client work while running some of my own personal projects on the side.
-
RE: Domain changed 5 months ago still see search results on old domain
I know you said you submitted a change of address to Google in Google Webmaster Tools, but did you add and verify both coedmagazine.com and coed.com in GWT, with coed.com specified as the new domain name? This is highlighted as an important step on https://support.google.com/webmasters/answer/83106?hl=en.
Assuming you do have both domain names verified in GWT, one suggestion would be to take a look at the Crawl Errors/Index Status for both domain names to see if there are any issues with Google not following the 301s correctly or somehow still accessing the site through the old domain name.
Also, make sure to check the robots.txt file within GWT to see if Google has any problems with it (I checked coed.com/robots.txt through http://tool.motoricerca.info/robots-checker.phtml and encountered some errors, but they may be too minor to affect Google).
-
RE: How can i improve alexa rank of my website
I would normally say "Alexa doesn't matter so don't worry about it" but it turns out a lot of advertisers and domain name buyers do consider it when determining the value of a site so there are definitely some benefits to having a decent Alexa ranking.
Anyways, to answer your question, in order to improve your Alexa ranking (without blatantly cheating), you basically need to increase the number of visitors to your website that have the Alexa toolbar installed. Here are some suggestions -
1. Most anti-spyware programs nowadays flag the Alexa toolbar as malware and remove it from the computer, which means most remaining Alexa toolbar users are not especially computer savvy and don't even know how they got the toolbar installed in the first place, let alone how to remove it. Take this into consideration when creating content/targeting keywords specifically for this kind of audience (i.e., likely the older generation).
2. Do keyword research to see what kind of searches are being made for "Alexa" related keywords such as "Alexa toolbar uninstall" or "remove Alexa toolbar." This is a dead giveaway that these users have the toolbar installed and are looking to remove it. You can get them to your site by ranking organically for those phrases or if you want these visitors quickly, start up an AdWords campaign and bid on "Alexa toolbar removal" related keywords. Sit back and watch your Alexa ranking take off.
3. My final suggestion is my least favorite since it's essentially peddling spyware to your user base, but it's still effective. Offer your users the option to download and install the Alexa toolbar from your website in exchange for some kind of reward. Again, I wouldn't suggest going this route since you run the risk that your users will abandon your site forever, but nonetheless it's still an option.
-
RE: Tool Request - What keywords does a site rank for?
SEMRush.com is exactly the tool you are looking for. One of my favorites.
-
RE: Regarding the META KEYWORDS tag
1. You won't be penalized; they are just ignored as a ranking factor, so the recommendation is simply not to waste your time adding them in the first place. Another reason to remove a Meta Keywords tag (an example is shown below this list) is that it provides an easy way for your competition to quickly see the keywords you are trying to target (any competent SEO could likely figure this out by analyzing other elements on your site, but don't make their job any easier!).
2. So your site currently has the Meta Keywords tag but you are trying to remove them, correct? If you are using Magento, you should be able to remove them either through the dashboard or by manually editing the template files.
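For reference, a sketch of the kind of tag being discussed (the keyword values here are hypothetical) -
<!-- hypothetical keyword values -->
<meta name="keywords" content="blue widgets, cheap blue widgets, widget store" />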
-
RE: I want to remove some pages from my site with PR, what should I do with traffic?
If you don't have similar pages to redirect to, I would do one of the following -
1. Keep the pages up but add some text that says "This is an archive so some of this content may be outdated" and then possibly link to other authoritative sites which do have updated content. This way your pages will likely still rank well in the search engines and the users will find what they are looking for, either in your archive or on the external sites.
2. Simply 301 redirect these pages to the homepage to preserve any link juice/link equity (a sample rule is sketched below). You will likely see a drop in organic traffic for any terms these pages are currently receiving, but it's a much better option than simply removing the pages and serving 404 errors, which is bad for both search engines and users alike.
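Assuming an Apache server, a hedged sketch of what that redirect could look like in .htaccess (the section path is a hypothetical placeholder) -
# Hypothetical example: send everything under a removed section to the homepage
RedirectMatch 301 ^/retired-section/ http://www.example.com/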
-
RE: How is Google crawling and indexing this directory listing?
There are numerous ways Google could have found those pages and added them to the index, but there's really no way to determine exactly what caused it in the first place. All it takes is one visit from Google for a page to be crawled and indexed.
If you don't want these pages indexed, then blocking those directories/pages in robots.txt alone is not the solution. The problem is that these pages are already in Google's index, and by simply using the robots.txt file you are just telling Google not to visit those pages from now on, so it will never see any removal signal and the pages will remain in the index. A better solution would be to add a meta robots noindex tag to those pages so that the next time Google accesses them, it will know to drop them from the index.
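As a sketch, the tag would go in the head of each affected page; noarchive is the directive that additionally removes the cached copy -
<!-- noindex removes the page from the index; noarchive removes the cached copy -->
<meta name="robots" content="noindex, noarchive" />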
And now that I've read through your post again, I realize you are talking about file directories rather than normal webpages. What I wrote above still mostly applies, but I think the quick and easy fix would be to turn off directory indexes altogether (unless you need them for some reason?). All you have to do is add the following code to your .htaccess file -
Options -Indexes
This will turn off these directory listings so users/search engines can't access them and they should eventually fall out of the Google index.
-
RE: Why is google not deindexing pages with the meta noindex tag?
Do you know when you added the noindex tags? Google will need to recrawl the pages and see the noindex tags before removing them. I just looked at one of your category pages and it looks like it was cached by Google on December 1st, and there was no noindex tag on that page at the time. How big your site is and how often it gets crawled will determine when the pages are removed from the index. Here's Google's official explanation -
"When we see the noindex meta tag on a page, Google will completely drop the page from our search results, even if other pages link to it. Other search engines, however, may interpret this directive differently. As a result, a link to the page can still appear in their search results.
Note that because we have to crawl your page in order to see the noindex meta tag, there's a small chance that Googlebot won't see and respect the noindex meta tag. If your page is still appearing in results, it's probably because we haven't crawled your site since you added the tag. (Also, if you've used your robots.txt file to block this page, we won't be able to see the tag either.)
If the content is currently in our index, we will remove it after the next time we crawl it. To expedite removal, use the URL removal request tool in Google Webmaster Tools."
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=93710
-
RE: Best way to analyze keyword difficulty for 10,000+ keywords?
I would also love to find an accurate way to instantly measure competition for a bulk list of keywords. Unfortunately, I think it's the inherent nature of this request that any such tool is either going to be quick but inaccurate/outdated OR accurate but slow. Keyword difficulty is determined by organic competition, which requires querying a search engine's organic listings for each keyword to grab a list of URLs, analyzing the on-page factors, analyzing the inbound links for each of those URLs, and then comparing all of these against each other to determine how competitive each phrase is.
And considering most users want to find out how difficult a keyword is to rank for in Google, most tools will need to query Google's rankings/API in real time for every single keyword to get the most accurate results. That also means the process is going to take some time, especially since Google actively tries to prevent tools from automatically grabbing/scraping its results (as Raven, WordStream and SEOmoz all found out just recently).
Anyways, I realize I'm not answering your question so I want to offer some suggestions, even if they are not exactly perfect -
1. Long Tail Pro - My favorite keyword research tool as of late, and they actually just released a new feature which allows you to import lists of up to 10,000 of your own keywords to check competition. However, since it queries Google, it's not going to be instant... that said, it does seem to be more robust and faster than my next suggestion.
2. Market Samurai - great for keyword research/competitive analysis but it can be somewhat slow. Their newest update features more competition metrics which look pretty useful.
-
RE: Is it better to delete old job pages on a recruitment site?
I'd recommend re-purposing the pages with expired listings. I imagine a lot of those pages rank pretty well for longtail phrases (for example something like "part-time dump truck driving job in rome, georgia") which are highly targeted and likely have minimal competition. Even if the job opening is no longer available, there is still value in keeping that page alive for the sake of ranking in the search engines. A couple of suggestions -
1. Set up a contact form/email opt-in for users to fill out to "let me know when jobs like this become available." This enables you to capture useful contact information from your users which you can use as you see fit.
2. Provide links to similar job listings that are available. For example, put on the page "this job listing has expired but check out other jobs in Rome, Georgia or dump truck driving jobs within 50 miles", etc. This way you keep the user on your site instead of them going to a competitor's site.
Otherwise, if you decide to delete a lot of these pages then you'll lose any existing rankings/traffic as a result.
-
RE: Site removed from Google Index
Since the "important pages" have been removed "by request" leads me to believe somebody with access to your GWT has manually requested to remove the site from Google via the URL removal tool. You should be able to see who has access to your GWT and who removed the URLs within GWT. Just go to "Google Index" then "URL Removal" then click on the select box to the right to see the URLs that have been removed by others. You should be able to resubmit the site/URLs to Google afterwards.
-
RE: On-Page SEO Fixes - Are They Relative?
I would suggest running the report card for each individual page for that page's targeted keywords rather than testing the homepage for all of your site's keywords.
-
RE: Gifly.com: url structure for new site with only one kind of page
I would suggest setting the canonical tag on each of the GIF pages. That way http://gifly.com/FoYi/ and http://gifly.com/FoYi/#blonde would not be considered duplicate content when Google encounters those URLs, because Google knows that http://gifly.com/FoYi/ is the designated canonical URL that should be indexed regardless of any hash fragments that happen to be appended to it.
I would also recommend setting up the category/hashtag pages as their own standalone pages. For example, all of the GIFs related to #blonde would be located on http://gifly.com/blonde/. Then once you click through to one of the individual GIF pages, they will link to http://gifly.com/FoYi/ or whatever the URL is. This way you can optimize the category pages for SEO, such as "Dumb Blonde GIFs" and put some original text on the page in addition to just the links to the individual GIFs.
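Using the example URL from the question, a sketch of the tag that would go in the head of that GIF page -
<!-- placed in the head of the FoYi page; any #hashtag variations resolve to this URL -->
<link rel="canonical" href="http://gifly.com/FoYi/" />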
-
RE: Guys & Gals anyone know if urllist.txt is still used?
I would just use a sitemap.xml file instead for Google, Bing and Yahoo. Then you can submit the sitemap.xml file within the Google Webmaster Tools and Bing Webmaster Tools (includes Yahoo). You can easily create an XML sitemap at http://www.xml-sitemaps.com/
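For reference, a minimal sketch of a sitemap.xml file (the URL entries are hypothetical placeholders) -
<?xml version="1.0" encoding="UTF-8"?>
<!-- hypothetical URLs for illustration; list one loc entry per page -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
  </url>
  <url>
    <loc>http://www.example.com/products/</loc>
  </url>
</urlset>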