Posts made by OlegKorneitchouk
-
RE: How does one write different pages of their website that are very similar in nature without using too much duplicate content?
Do they deserve their own pages, or can they be combined into one? If they are so similar that you can't explain them in different terms, you are probably better off combining them into one page.
-
RE: Syndicated content outperforming our hard work!
The question: does anyone have a guess as to what is making it perform so well?
You have a stronger link profile but I think they are winning the SERPs because they post "Recent" links on their homepage that link to news and user submissions. This in turn lets crawlers syndicate the latest submissions quicker, their homepage is crawled more often, and they rank quicker/better because of the Query Deserves Freshness (QDF) factor.
I recommend you try doing the same thing and see if that helps you.
--
I also only found 5 instances of your articles being sourced - https://www.google.com/search?q=site:accidentin.com+intext%3Afindmyaccident.com
What kinds of keywords are they outranking you for? Do you have an RSS feed, or how else are they scraping your content?
--
In general, scraper sites are not supposed to do well and will probably lose value, but I've seen several examples where they are performing really well.
Cheers & Good Luck,
Oleg -
RE: Google suddenly indexing and displaying URLs that haven't existed for years?
Is Google suddenly ignoring rewrite rules and redirects?
Shouldn't be... pretty odd. You can try blocking the crawler from accessing the old .jsp pages if they all follow a format (the code below assumes every page starts with /xref_):
User-agent: *
Disallow: /xref_*
Looks like you don't really need a RewriteRule line there... just a redirect would do the trick:
Redirect 301 /xref_interlux_antifoulingoutboards&keels.jsp /userportal/search_subCategory.do?categoryName=Bottom%20Paint&categoryId=35&refine=1&page=GRID
But I don't think that is the problem, since it's still sending a 301 response code when you visit the .jsp file.
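If you want to sanity-check a rule like that before shipping it, Python's stdlib robots.txt parser can simulate how a standards-compliant crawler reads it. Note that standard robots.txt matching is prefix-based, so /xref_ without a trailing * is enough (the stdlib parser doesn't understand the * wildcard anyway). The domain and URLs below are placeholders:

```python
from urllib import robotparser

# Hypothetical robots.txt rules blocking the old .jsp pages.
# Matching is prefix-based, so "/xref_" covers every URL that
# starts with /xref_ (no wildcard needed).
rules = [
    "User-agent: *",
    "Disallow: /xref_",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

# An old .jsp URL under /xref_ is blocked...
print(rp.can_fetch("Googlebot", "https://example.com/xref_old_page.jsp"))  # → False
# ...while the current pages stay crawlable.
print(rp.can_fetch("Googlebot", "https://example.com/userportal/search_subCategory.do"))  # → True
```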
One thing that may help is adding canonical tags to your current pages - make sure you utilize rel=canonical as well as rel=next/prev for your paginated pages.
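For reference, a minimal sketch of what those tags look like in the head of a paginated page (each page canonicals to itself and points at its neighbors; all URLs here are hypothetical, not your actual ones):

```html
<!-- On page 2 of a paginated series; URLs are placeholders -->
<link rel="canonical" href="https://example.com/category/page2" />
<link rel="prev" href="https://example.com/category/page1" />
<link rel="next" href="https://example.com/category/page3" />
```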
Overall, I'm not sure =/ Try posting/submitting it to G, could be a bug.
-
RE: UX Design: Do Directional Cues Help?
Arrows, Colors, Whitespace, Images
A combination of those usually gives the best direction cue to visitors.
People are attracted to images/colors so that is where the eyeball jumps first. If you have a loud page, people's attention is going to jump all over the place (which is when arrows come in most handy).
If you have a clean page (lots of whitespace), any graphical/colorful element is going to pop and attract attention.
Best advice I can give is to Always Be Testing. Set up experiment after experiment to optimize the visitor flow.
Cheers & Good Luck!
-Oleg -
RE: Does Adwords Use Broad On Results?
If for example my campaign was targeting the keyword 'flowers' and as a result the long tail keywords that went with that keyword would adwords show my advertisement to those searching black flowers or flowers for sale?
Yes. If you target 'flowers' as broad match, your ad would be eligible to show for any query that contains the term 'flowers'.
A further question is if I limit my area to say my state if it was in the US would the word flowers include flowers statename or statename flowers?.
Interesting questions. I know selecting a state means G would serve the ads to people who have an IP/geolocation within the selected area. But if someone searches 'NJ flowers' from outside of NJ and you select to only display ads in NJ, I'm not sure if they will appear.
If google does count and show to the above does it count these when it provides results in the traffic estimator?
Broad match counts all queries that contain the keyword you selected, so yes, 'statename flowers' would be included in their estimates.
-
RE: What is the most current advice on editing your company Wikipedia page?
I've noticed a lot of incorrect and misleading information on our company Wikipedia page.
That just about answers your question. YES! This is reputation management 101. The key is just not to turn your Wikipedia page into a sales pitch. Whatever changes you make, you absolutely need to cite. The key to not having your edits reverted: BE OBJECTIVE. Stick to the facts and nobody will mess with your fixes.
If you start talking about how "is the #1 yyyy company in the world as rated by www (which happens to be owned by yyyy)", you better believe that an unhappy customer/competitor may start a ruckus (and rightfully so). Just be fair, objective and scientific about the way you make changes.
Keep in mind that there is a conflict of interest and any changes will be scrutinized by the public so you really don't want to add fluff or seem really PRO-yyyy.
Keep your changes short, concise and don't completely change the article from the get go.
No sources, but you should read Wikipedia's rules for organizations.
-
RE: Google Preview not showing images
It's because you are blocking crawlers via your robots.txt file - http://images2.spies.dk/robots.txt
Just make it blank and wait for your images to be indexed. They should appear in search after that.
-
RE: CDN and the Google ranking?
A CDN is used to improve page load speed. Since load speed is a minor (but still considered) part of SEO and SERPs, the only time you would really see a change in SERPs from implementing a CDN is if you had a REALLY bad load speed that was drastically improved.
However, good on you for improving the visitor experience!

-
RE: Really bad technical SEO and Nofollow
Read this post by Matt Cutts on nofollow and pagerank sculpting
Essentially, you still divide your link authority by the total number of links on the page (including nofollowed links) and only pass authority through the followed links.
So each followed link passes just as much authority as before, while the share that would have gone to the newly nofollowed link simply evaporates (which doesn't fix your situation from what I can tell).
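As a toy illustration of that math (a simplified model with made-up numbers, not Google's actual algorithm): with 10 links on a page, nofollowing 2 of them doesn't change what each followed link receives.

```python
# Simplified post-2009 PageRank model: authority is divided by the
# TOTAL link count, nofollowed or not (toy numbers, not real PageRank).
authority = 1.0
total_links = 10
nofollowed = 2

# Each link's share is the same before and after adding nofollow.
per_link = authority / total_links
# Only the followed links actually pass their share on.
passed_on = per_link * (total_links - nofollowed)

print(per_link)   # 0.1
print(passed_on)  # 0.8 -- the nofollowed links' share simply evaporates
```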
You can possibly noindex the duplicate pages? I'd have to see specifics before I can say for sure.
Best course of action is a navigation restructure (maybe even a site restructure if each product is a post (timely)... each product should at least be a page (timeless)).
Good luck!
Oleg -
RE: Google News Referral Traffic Gone after CMS Upgrade
We recently upgraded our CMS which now includes automated sitemaps (not sure if this impacts news sitemaps, though).
How did it work previously?
Did traffic go down in general? I know G changed the way analytics interprets image/product-based searches and just changed how they were categorized (you're still receiving the traffic, it's just labeled differently).
I also recommend posting your issue here and seeing if a rep can help get it resolved.
-
RE: Would removing or making non relevant links no follow boost a site?
Instead of removing links, just focus on building new links.
The only time I would remove links is if you were affected by Penguin or a manual penalty (i.e. a large part of your backlink profile comes from shitty sites). Receiving a link from an irrelevant site wouldn't hurt your rankings unless you do it en masse. Removing those links may hurt your authority.
Cheers,
Oleg -
RE: 302 redirect and NO DATA as HTTP Status in Top Pages in SEOMOZ Link Analysis
302 = temporary redirect. When the crawler landed on your page, it was redirected to another URL.
[NO STATUS] / [NO DATA] - try running it again? If it still shows up, you may be blocking crawlers; try fetching your site as G - also check your robots.txt, .htaccess and meta tags to make sure you aren't blocking it there.
-
RE: Cocitation with just address?
My question is: is only the address sufficient?
Yes, what Rand posted was theoretical and just posting the address would theoretically give you a boost as well (just thinking from a crawler POV). I would add a link where you could since that is still the best way to earn a citation.
How much of a boost would 20 or so cocitations a month give to a medium competition local search term in G places?
Current rankings? Authority of your site? Authority of the cocitations? Relevance of content with the cocitations? Even if we knew all those, there are too many variables to say for sure. Only one way to find out...

-
RE: What's the best way to handle crawling of photo gallery?
Looks like your filters work via the POST method, so you don't have much choice. There are no URL parameters you can block. rel="prev/next" is the way to go to index all images and pages without duplicate content.
-
RE: How do i show my link xls file to google?
You're welcome!
Just to add some more credibility, the new Matt Cutts video supports my initial statement.
Hope everything went well for you!
-
RE: MSNbot Issues
Yes, you can add that to your robots.txt file and it should slow down the crawl rate. I haven't tested it myself but have seen many instances of it. Let us know how it works out!
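Assuming the directive in question is Crawl-delay (which Bing's crawler supports), the robots.txt entry would look something like this (the delay value below is just an example, in seconds):

```
User-agent: msnbot
Crawl-delay: 10
```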
-
RE: 301 redirects from old site to new
Nope, that's exactly what you should do.