How about a temporary un-capping of the limit? You are making your product harder to use and it's affecting business. Fine if you want to charge more, but can we have it the way it was before you announce a change?
Posts made by 540SEO
-
RE: KW Difficulty Daily Limit
I think a 400 limit per day is unreasonable, especially given the cost of a Moz subscription. Usually when doing KW research, I have 10 or so categories of KWs, each with 50-75 terms. I don't use the tool daily, but when I do, getting hung up by a term limit puts my deadlines at risk.
Why was there no notification of this change to your paying users?
-
Client wants to distribute web content to dealers - iFrame?
I have a client who sells a product through a network of nationwide dealers. He wants to provide updatable content to these dealers so they can create sections on their websites dedicated to the product, e.g. www.dealer.com/product_XYZ. The client is thinking he'd like to provide an iframe solution to the dealers, so he can independently update the content that appears on their sites.
I know iframes are old, but are there any SEO concerns I should know about? Another option is to distribute the content as HTML that includes a rel=canonical tag as part of the code, but then he loses the ability to centrally update all the distributed content.
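For illustration, the rel=canonical option would mean each piece of distributed HTML carries a canonical tag pointing back at the client's master copy — a sketch only; both domain names here are placeholders:

```html
<!-- In the <head> of each dealer page, e.g. www.dealer.com/product_XYZ -->
<!-- Points search engines at the client's master copy of the content -->
<link rel="canonical" href="http://www.client-site.com/product_XYZ">
```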
Are there other solutions he should consider?
Thanks --
-
RE: Express Update USA not available for SEO's
Totally agree the manual updating thing isn't scalable or a long-term play. What I haven't gotten comfortable with yet is how the large distributors will adequately handle the clean-up piece -- dupes, incorrect info from previous listings, etc. The distribution piece is easy now; the clean-up component isn't, and that's the piece of Local SEO that makes a huge difference -- a clean Local footprint.
-
RE: PDF on financial site that duplicates ~50% of site content
Thanks. Anybody want to weigh in on where to rel=canonical to? Home page?
-
RE: PDF on financial site that duplicates ~50% of site content
I thought the idea was to put rel=canonical on the duplicated page, to signal that "hey, this page may look like duplicate content, but please refer to this canonical URL"?
Looks like there is a pdf option for rel=canonical, I guess the question is, what page on the site to make canonical?
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=139394
Indicate the canonical version of a URL by responding with the Link rel="canonical" HTTP header. Adding rel="canonical" to the head section of a page is useful for HTML content, but it can't be used for PDFs and other file types indexed by Google Web Search. In these cases you can indicate a canonical URL by responding with the Link rel="canonical" HTTP header, like this (note that to use this option, you'll need to be able to configure your server): Link: <http://www.example.com/downloads/white-paper.pdf>; rel="canonical"
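If it helps, the server-side piece for the PDF might look like this on Apache with mod_headers enabled — a sketch, assuming you can edit the server config or .htaccess; the filename and canonical URL are placeholders:

```apache
# Send a rel="canonical" HTTP header for the duplicated PDF,
# pointing at whichever on-site page you choose as canonical
<Files "white-paper.pdf">
  Header add Link '<http://www.example.com/chosen-canonical-page/>; rel="canonical"'
</Files>
```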
-
RE: PDF on financial site that duplicates ~50% of site content
Not sure which page I would mark as being canonical, since the pdf contains content from several different pages on the site. I don't think it's possible to assign different rel=canonical tags to separate portions of a pdf, is it?
-
PDF on financial site that duplicates ~50% of site content
I have a financial advisor client who has a downloadable PDF on his site that contains about 9 pages of good info. Problem is much of the content can also be found on individual pages of his site.
Is it best to noindex/follow the pdf? It would be great to let the few pages of original content be crawlable, but I'm concerned about the duplicate content aspect.
Thanks --
-
RE: Local business with multiple sites
Related question: if location #2 is brought under site 1 as a location page, what's best practice as far as putting the address in the footer sitewide? Put location 1 address in the footer everywhere but the location 2 page(s)? Avoid altogether?
Thanks --
-
RE: Local business with multiple sites
Hi -- thanks for your help. Here is more info in response to your answer:
- I have picked up duplicate content problems and will be working with the client to fix them
- The locations are in the same metro area (good to know that separate states can be a good reason to keep separate sites)
- The lost rankings/Places shake-up is a bit concerning. Site 1 is well-established, with a domain authority of 38 and home page authority of 48 (this is the site I'd likely move everything to). Location 2's site is 2.5 years old but has 23 domain authority and 36 page authority. Site 3 is an online store for spa products and very new (not yet launched).
For queries that trigger a Places result, location #1 outranks location #2 in every instance I can find. Having location #2 disappear for a while wouldn't be great, but from what I see the location #1 site ranks really well Organically (and there is a prominent link to location #2 on the home page) so we may be OK.
Also, there are a few queries where local results are not triggered, and the location #2's site ranks high. I'm not worried about the Organic ranking scenario in this case because a 301 redirect should largely preserve the position, correct?
In any case, I think the benefits -- especially the potential for consolidated ranking power -- outweigh the costs. I'll keep my fingers crossed that the Places shake-up will be short-lived and advise the client accordingly.
Thanks -- let me know if I missed anything.
-
RE: Local business with multiple sites
Hi -- there are no subdomains in the equation here. I would be moving to subfolder pages (not subdomains), and the current domains are separate domain names altogether.
Thanks --
-
RE: Youtube dofollow link to web site
YouTube links in the video and Channel to your site are all nofollow. I'm not sure if they were ever dofollow, but regardless, it's not a bad link to get to round out a natural looking link profile (plus the links can generate traffic). If your video has a good chance of getting shared outside of YouTube, make sure to put your domain name and/or brand in the video itself so you get attribution and traffic.
-
Local business with multiple sites
I'm auditing a local business' sites (a spa) and I wanted to run my recommendations by everyone.
There are 3 sites:
www.sitename1.com -- main store location, used for Google Places listing #1
www.sitename2.com -- 2nd store location, used for Google Places listing #2
www.sitename3.com -- used for product sales for both locations
Sitename1.com has the most ranking power. I'm going to recommend that they move sitename2.com and sitename3.com to sitename1.com as subfolders, 301 redirecting each page to the corresponding page on sitename1.com/subfolder.
Google Places listing #2 would be changed from www.sitename2.com to www.sitename1.com/location2.
Any risks or problems with this strategy anyone can see?
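For what it's worth, the page-by-page 301s could be handled with mod_rewrite in sitename2.com's .htaccess, along these lines — a sketch, assuming Apache and that old URL paths map one-to-one onto the new subfolder:

```apache
RewriteEngine On
# Send every path on the old location-2 domain to the matching
# page under /location2/ on the consolidated site
RewriteRule ^(.*)$ http://www.sitename1.com/location2/$1 [R=301,L]
```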
-
RE: Video thumbnail pages with "sort" feature -- tons of duplicate content?
Thanks for the link. That does help with the paginated pages issue.
Anyone have any thoughts on the sort feature and how that will be viewed by the search engines? Are different "sort" results containing some of the same video thumbs/descriptions considered duplicate content? What's the best way to handle that?
-
Video thumbnail pages with "sort" feature -- tons of duplicate content?
A client has 2 separate pages for video thumbnails. One page is "popular videos" (/videos?sort_by=popularity), with a sort function and over 700 pages of video thumbnails -- 10 thumbnails and short descriptions per page.
The second page is "latest videos" (/videos?sort_by=latest) with over 7,000 pages.
Both pages have a sort function -- including latest, relevance, popularity, time uploaded, etc. Many of the same video thumbnails appear on both pages.
Also, when you click a thumbnail you get a full video page and these pages appear to get indexed well.
There seem to be duplicate content issues between the "popular" and "latest" pages, as well as within the sort results on each of those pages. (A unique URL is generated every time you use the sort function, e.g. /videos?sort_by=latest&uploaded=this_week.)
Before my head explodes, what is the best way to treat this? I was thinking a noindex,follow meta robots tag on every page of thumbnails, since the individual video pages are well indexed, but that seems extreme. Thoughts?
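For reference, the noindex,follow meta robots tag I'm considering would look like this in the head of each thumbnail/sort page — a sketch of the option, not a recommendation either way:

```html
<!-- Keep this sort/pagination page out of the index,
     but let crawlers follow its links to the video pages -->
<meta name="robots" content="noindex,follow">
```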
-
RE: Press Release for INFOGRAPHICS
I've done a few IGs, and an effective strategy is to develop a hit list of blogs and websites that are relevant to your topic, then contact each one with a brief explanation of the infographic and why it's relevant to their audience. Track responses in a spreadsheet and follow up with folks you haven't heard from 5-7 days later. Keep emails brief, friendly and professional, and craft a compelling headline that isn't spammy.
The most important part of the IG is the embed code. Provide it in the emails you send out so people can easily grab it and post to their site/blog. Also, create a page on your site that includes the embed code (make sure you post the IG on your site first so Google credits you with the content). When people post using the embed code, the result should be your IG, an anchor text link back to your site, and a "Place this graphic on your site" link. Also, don't forget to register the work with Creative Commons.
Here's a recent example of how it works: http://540seo.com/are-online-reviews-killing-your-business
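A minimal version of the embed code described above might look like this — a sketch only; the URLs, dimensions, and anchor text are placeholders to adapt:

```html
<!-- Embed code distributed with the infographic: image, attribution
     link back to the source page, and a link to grab this same code -->
<div class="infographic-embed">
  <a href="http://www.yoursite.com/infographic-page/">
    <img src="http://www.yoursite.com/images/infographic.png"
         alt="Infographic title" width="600">
  </a>
  <p>Infographic by <a href="http://www.yoursite.com/">Your Brand</a> |
     <a href="http://www.yoursite.com/infographic-page/">Place this graphic on your site</a></p>
</div>
```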
-
Robots.txt blocking site or not?
Here is the robots.txt from a client site. Am I reading this right -- that the robots.txt is saying to ignore the entire site, but the #'s are saying to ignore the robots.txt command?

# See http://www.robotstxt.org/wc/norobots.html for documentation on how to use the robots.txt file
#
# To ban all spiders from the entire site uncomment the next two lines:
# User-Agent: *
# Disallow: /