No. Just a new page.
Posts made by Chris.Menke
-
RE: Main keyword not ranking in top 100 - but others are #1
That doesn't seem like it would be the problem. Has that site always been that clean?
How about content quality? Good?
Have you considered just whitewashing the current URL that's been optimized for your main keyword and starting over with a new URL? Even if it is your homepage. Like, just re-optimize that current page with a bland/ non-cannibalizing keyword and start fresh with a new URL, new copy, new images. No 301s, no rel=canonical, no nothing. Could be worth a try. You can always revert back to the originals if nothing shakes free after a few months, right?
-
RE: Main keyword not ranking in top 100 - but others are #1
Over optimization?
That's not out of the question with EMDs. Sometimes that can be hard to get past.
Maybe you got a little spammy with it back in the day? You have the keyword all over your homepage, in most of your external link anchor text, title, alt tags, internal links, etc?
That could be it.
-
RE: How ask Google to de index scrapper sites?
Doug,
Two things:
1. A site shouldn't "go by" two domain names. Check every way a visitor might reach the site under either name, and make sure one of the domains is being 301 redirected to the other.
2. If you don't own the domains that contain the scraped content, you can contact the site owner to request that they take it down. There is also Google's copyright help center at https://support.google.com/legal/topic/4558877 if you want to deal with it that way.
You may want to step back and get a bigger-picture understanding of the inner workings of your website, your hosting, and Google. It will help you in both the short term and the long term.
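To give point 1 some shape: if the site runs on Apache, a minimal sketch of that 301 might look like this in .htaccess (the domain names here are placeholders, not yours):

```apache
# Send every request for olddomain.com (with or without www)
# to the same path on newdomain.com with a permanent (301) redirect.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?olddomain\.com$ [NC]
RewriteRule ^(.*)$ https://newdomain.com/$1 [R=301,L]
```

The exact rules depend on your server and host, so treat this as the idea, not copy-paste config.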
-
RE: Which pages should I index or have in my XML sitemap?
I think they should be indexed, but keyword research should shed light on this topic for you. It will tell you whether your audience is searching for those things and in what numbers. Even as they are, though, they might make sufficient landing pages for Google. You could remove the noindex from a group of those pages at a time, starting with the ones most likely to be popular, and see how Google treats them. I think I'd go that route rather than release them all into the wild at once.
To me, the pages with the most interesting potential are the /venues/ pages like /venues/md-concert-venues/a, for example. I think the potential lies in populating them with venue grouping, upcoming-artist grouping, and state. How hard would it be to populate an area above the black line with all or some of the upcoming artists playing near the hotels shown on that page? That three-way cross-referencing would make those pages fairly unique on the web and unique on your site, and would give Google a number of good reasons to send traffic there. They'd probably be good pages to publish advertising on, too.
Also wondering if there is such a thing as "licensing" dedicated pages out to companies/hotels that are putting on non-musical events like conferences, etc., so they can link to a kind of pre-fab hotels-close-by page for their attendees?
-
RE: Domains vs Subdomains for similar brands.
RA, my opinion--from a visitor perspective, I'm not a fan of the subdomain route.
-
RE: Which pages should I index or have in my XML sitemap?
Mike,
I'm wondering...is that an SEO question? It sounds like a business decision to me. From what you've said, I don't see any reason for Google to ding you on anything. My only questions would be: Is Google indexing all the pages you want it to, and has it kept your noindexed pages out of the index? Any bad links coming in? Are pages loading at a decent speed? Oh, and I don't see a reason to have your noindexed pages in the sitemap.
Other than that, if those non-performing pages are taking up time that you could be spending on more productive pages or on exploring more productive opportunities, then, again, it's time to put on your CEO cap.
-
RE: Multiple sub-category of the same name ? does that effect SEO
LKC,
So long as the content on those individual product and category pages is unique and you are canonicalizing duplicate content, there shouldn't be any problem.
-
RE: Sometimes I write a post for another site and then post the first 1/3rd on my site to help promote it. Is this a bad idea?
Leaving it as is is fine, although you can also properly rel=canonical it to the full post on the other domain if you or the publishing site want you to. I would probably do that.
Technically, it is considered duplicate content, but Google doesn't actually penalize for "duplicate" content per se. In cases such as yours, Google just doesn't assign any ranking strength to the duplicate version--that is not the same as a penalty. So, if it is indexed first on the other site, and/or that other site is a stronger site, and there is no intent to make money by monetizing copied content, Google will just assign the value for that content to the other site.
https://webmasters.googleblog.com/2013/04/5-common-mistakes-with-relcanonical.html : We recommend the following best practices for using rel=canonical:
One test is to imagine you don’t understand the language of the content—if you placed the duplicate side-by-side with the canonical, does a very large percentage of the words of the duplicate page appear on the canonical page? If you need to speak the language to understand that the pages are similar; for example, if they’re only topically similar but not extremely close in exact words, the canonical designation might be disregarded by search engines.
- A large portion of the duplicate page’s content should be present on the canonical version.
https://support.google.com/webmasters/answer/66359?hl=en Duplicate content on a site is not grounds for action on that site unless it appears that the intent of the duplicate content is to be deceptive and manipulate search engine results. If your site suffers from duplicate content issues, and you don't follow the advice listed above, we do a good job of choosing a version of the content to show in our search results.
“Duplicate content generally refers to substantive blocks of content within or across domains that either completely match other content or are appreciably similar. Mostly, this is not deceptive in origin…..” Google Webmaster Guidelines
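In markup terms, the cross-domain canonical would be a single line in the head of the excerpt page on your site (the URL here is a placeholder for the full post on the other domain):

```html
<!-- In the <head> of the excerpt page on YOUR site; href points to
     the full article on the publishing site: -->
<link rel="canonical" href="https://othersite.example/full-post/" />
```

Per the quoted guidance above, this only tends to stick if the excerpt's words substantially overlap the full post's, which a first-third excerpt usually does.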
-
RE: Setting up analytics for a website redesign
Funny you said it's the IT department's decision. From the sound of this scenario, I seriously almost asked if your web programmers were the ones running the show over there, but then thought better of it. : ) I was pretty close though, huh?
But we are talking about 3 domains here: www2, www, and non-www. All three with very similar, if not identical, content, and in this scenario you can't canonicalize www and non-www back to www2 because www2 is not in the index. How can www2 send search traffic to www or non-www if search doesn't even know www2 exists?
Is it their new architecture that they're worried about converting, or is there new content going in? Because if it's just new content, you could start A/B testing that now. But I'm guessing it's the architecture they're worried about. So maybe they want the glory of creating a brand-new site from scratch, but they want you to be responsible for making sure it works as advertised.
If that's the case, I'm not sure you have a choice but to document the heck out of the current site's performance, conversion-wise as well as search-wise, then just throw the new site up on www and start tweaking. With solid current-performance documentation, you'd have clear metrics to bring the new site back up to and, hopefully, surpass.
-
RE: Setting up analytics for a website redesign
hmmm, not sure I get it. The whole 3 domains thing seems very convoluted. Why not just A/B test off the one domain between re-makes of the old landing/conversion pages and the new ones?
-
RE: Is It Possible to Be Punished By Google For Getting Too Many Links?
Too many, as in:
- Relation to site age
- Relation to others in your industry
- Relation to incremental velocity
- Relation to the quality of links
Google has seen at least tens of thousands of websites in your industry and is aware of every site that has ever linked to sites in your industry. It knows how fast those others have acquired links, at what age they acquired them, when and why spikes in link acquisition have occurred, and what percentage of their links tend to be low quality.
Think of a link as a handshake between you and a businessperson in an industry complementary to yours, someone you had a good conversation with over lunch at an industry conference. If the relationship between you and the linking site isn't at least that close, the link may be counterproductive.
-
RE: Is It Possible to Be Punished By Google For Getting Too Many Links?
"Is it possible for google to punish us for getting too many too fast?"
Absolutely. Maybe you didn't read the sentence after the one that told you getting links to your site is good. That following sentence said: "If you are not familiar with how Google's algorithm values links, do not engage in link building and do not purchase links, because there is a good chance that your rankings could suddenly drop dramatically at some point in the future."
Did you miss that sentence?
-
RE: When block bots using robots.txt vs meta tag "no index"?
1. No. It is not crawled and it is not indexed--unless Googlebot finds the URL elsewhere, or the page was crawled before it was made inaccessible via the login.
2. The purpose of the robots.txt file is to tell bots not to access resources on a site, so I don't see how it would be bad in your case.
3. If you block a page via robots.txt (or if it is redirected), Google will never get to the noindex meta tag on the page itself. You have to let Googlebot crawl the page first and see the noindex tag; only then can you block it via robots.txt. https://support.google.com/webmasters/answer/93710?hl=en
4. Still not clear on the redirect chain issue.
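To make point 3 concrete, here's what the two mechanisms look like side by side (the path is a placeholder). The key is that the meta tag only works on a page that is NOT blocked in robots.txt, because a blocked page never gets crawled, so the tag is never seen:

```
# robots.txt -- tells bots not to CRAWL this path at all:
User-agent: *
Disallow: /private-page/
```

versus the on-page tag:

```html
<!-- In the page's <head> -- lets bots crawl the page,
     but asks them not to INDEX it: -->
<meta name="robots" content="noindex">
```

Use one or the other for a given page depending on whether you want the URL crawled, not both at once.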
-
RE: When block bots using robots.txt vs meta tag "no index"?
Hey Doug,
Your question sounds interesting, but it's a bit hard to understand what you're asking. Think you could rewrite it from scratch below? Be sure to mention any redirects you're employing. No need for the link again. Maybe that will shed more light on your issue.
-
RE: What heading tag to use on sidebars and footers
Kerry,
Don't worry about this too much. Putting a lot of thought into heading tags is probably more important to your own development of copy than it is to Google--once you get past the H1 tag, and even that is hardly a noticeable factor. Any well-constructed copy is going to have the most important words and concepts at the beginning, and your heading tags just reinforce that.
However, you don't necessarily want to think of lower-tier headings as places to put "additional keywords". Think of them, maybe, as opportunities to use vocabulary that supports and broadens the concepts behind your page's primary terms. I know the web is chock-full of landing pages that organize their keywords that way, but for the most part I think that's fruitless for better search results and BORING for the reader.
These subheadings are your opportunity to distinguish your publication from the hordes of others when your audience first skims your article. Think them through and use them to enhance the reader/user experience first, and search engines somewhere after that.
-
RE: My Website disappeared from Google Search Results overnight
Simon,
I've yet to run into someone who is happy with google's reason for their reduced traffic. : ) At least you got a reply and something to work with going forward.
You have a 1,000-word review on each of over 1,000 pairs of running shoes. I gotta wonder how unique each of those reviews is. Sure, the order of the words may be different, but what about the variety of words on each page? How authoritative are those reviews, and are they of the caliber of an expert review by someone who has really put some time into running in the shoes and then given a thoroughly thoughtful write-up?
You are up against a lot of competition in this market, and just throwing up a basic, rehashed review for each one is not going to cut it against the heavy hitters. I often tell new website marketers to REDUCE the number of products on the site, focus on fully developing the content, and define a niche audience for that subset. That is far better than trying to start with a warehouse full of products.
-
RE: Home page optimisation
TW,
You should probably read this: Keywords and Keyword Optimization
-
RE: Alt tags
TW,
Don't keyword-stuff the alt tags. Alt text should accurately describe the image for screen-reader users and for people who have images turned off in their browser. Each image should have a different description and a good reason for being on the page.
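For example (the filename and wording here are made up, not from your site):

```html
<!-- Describes what the image actually shows; no keyword stuffing: -->
<img src="red-trail-shoe.jpg"
     alt="Red trail running shoe with a knobby rubber outsole">
```

Contrast that with something like alt="running shoes buy running shoes cheap running shoes", which describes nothing and reads as stuffing.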
-
RE: How many pages should we optimise?
TW,
The limit to the number of keywords you can rank for corresponds exactly to how much time you have to put into each term/page. A page really isn't worth spending time on if you're not going to go all the way with it, and all-out optimization for a page can take considerable time--not just on the page itself but in keyword research as well. Whether to go for a single keyword or multiple keywords for a product page depends on what your research determines.
It's interesting that you said "each page with a unique keyword" instead of "each product". Do you have more than one product on each page? Do you have all of your products on each page? Some of your products on each page? Category pages?
FYI, I always recommend starting with the lowest-hanging fruit. That way you get the satisfaction of seeing some results from your early work while you're putting in the time on pages that are harder to optimize, rank, and get traffic for.