Good point Chris... I know, old school - refreshing my skills!
Going to remove the bolding; I just needed to know if it could be causing a penalty or not.
Thanks!
It used to work for me on some sites - but maybe it's considered spammy these days?
Any feedback appreciated.
Thanks.
Our landing pages are both - depending on Google, we rank for some category pages and also for some products. The content and keywords are sprinkled into the content without overspamming, so it should make semantic sense what our pages are about.
I think I really need to work on the nav - I don't think the link juice is flowing.
--- your product descriptions are duplicated on other sites and your site has been hit by Panda or the age-old duplicate filter (YES, they are - we are writing new content that is not copied from the manufacturer descriptions, i.e. our own unique content - in the meantime, I have set META NOINDEX, FOLLOW)
--- you are working on a nascent site that has no links and that nobody is tweeting, liking or citing (Partially correct - we are tweeting and liking it via various social network accounts of our own)
--- you have a bad link profile and have been hit by Panda
(I can't see any "bad links" from disreputable domains - our link profile is pretty thin, with only a couple of external references from other websites and nothing too spammy)
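For reference, the NOINDEX, FOLLOW mentioned above is just a robots meta tag in each affected page's head - something like:

```html
<!-- keeps this page out of the index, but still lets crawlers
     follow its links so link equity flows to the pages it links to -->
<meta name="robots" content="noindex, follow">
```

Once the unique descriptions are live, the tag gets removed so the page can be indexed again.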
So, what do you say about the three items above?
Also, those 70 visitors: what are their referring domains, or where are they coming from? We are getting exposure in Google, so most traffic comes from Google, but not enough.
If we were hit by a Google penalty, what would it look like?
Hi everyone,
I hope you have a couple of mins to give me your opinion. The ecommerce site has around 2,000 products, in English and Spanish, and only around 70 hits per day, if that.
We have done a lot of optimisation on the site - page titles, URLs, content, H1s, etc. Everything on-page is pretty much under control, except I am starting to realise the site architecture could be harming our SEO efforts.
Once someone arrives on the site, they are language-detected and 302-redirected to either domain.com/EN or domain.com/ES, depending on their preferred language.
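For what it's worth, the language detection described above usually just reads the browser's Accept-Language header. A minimal sketch (the /EN and /ES paths are from the setup above; the function name and fallback are my own illustration):

```python
# Sketch of Accept-Language detection behind the 302 described above.
# Assumption: anything not Spanish falls back to the English version.

def pick_language_path(accept_language_header, default="/EN"):
    """Return /ES for Spanish-preferring visitors, /EN otherwise."""
    if not accept_language_header:
        return default
    # Take the first (highest-priority) tag, e.g. "es-ES,es;q=0.9,en;q=0.8"
    first = accept_language_header.split(",")[0].strip().lower()
    if first.startswith("es"):
        return "/ES"
    return default

# The server would then respond: 302, Location: https://domain.com + pick_language_path(...)
```

One thing worth checking in this setup is that the bare domain.com 302 doesn't leave the root URL as the one accumulating links while the /EN and /ES pages hold the content.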
Then on the homepage, we have the big MEGA MENU - and we have:
CAT 1
  SubCat 1
    SubsubCat 1
    SubsubCat 2
    SubsubCat 3
Overall, there are 145 "categories". Plus links to some CMS pages, like Home, Delivery terms, etc...
Each Main Category, contains the products of everything related to that category - so for example:
KITCHENWARE
  COOKWARE
    SAUCEPANS
    FRYING PANS
  BAKINGWARE
    BOWLS
Kitchenware contains ALL PRODUCTS OF THE SUBCATS BELOW - so cookware items, saucepans, frying pans, bakingware, etc. - plus links to those categories through breadcrumbs and a left-hand nav, in addition to the mega menu above.
So once the bots hit the site, immediately they have this structure to deal with.
Here is what stats look like:
Domain Authority: 18
www.domain.com/EN/
PA: 27
mR: 3.99
mT: 4.90
www.domain.com/EN/CAT 1
PA: 15
mR: 3.05
mT: 4.54
www.domain.com/EN/CAT 1/SUBCAT1
PA: 15
mR: 3.05
mT: 4.54
The product pages themselves have a PA of 1 and no mR or mT.
I really need some other opinions here - I am thinking of:
But I am willing to hear any other ideas please - maybe another alternative is to start building links to boost DA and link juice?
Thanks all,
Ben
Thanks Paddy! - great answer
Thanks Mike!
Would love to fill out that spam report - but I think I'm gonna go with the solution below - the domain in question has a lot of natural links, but no keyword specific links in directories - whilst the other site is built solely on directory links -
Hopefully a few directory links will push the balance in our favour!
Thanks Takeshi!
I guess my next question is how do I obtain authoritative links? Thanks!
Since Google's latest updates, I think it would be safe to say that building links is harder. But I also read that Google applies its latest guidelines retroactively. In other words, if you have built your linking profile on a lot of unnatural links with spammy anchor text, you will get noticed and penalized.
In the past, I used SEO-friendly directories and "suggest URL" submissions to build backlinks with keyword/phrase anchor text. But I thought this technique was frowned upon by Google these days.
So, what is safe to do? Why is Google not penalizing the competitor?
And bottom line what is considered to be "unnatural link building" ?
You probably know what I mean, the report in Google Webmaster Tools > Sitemaps.
So how do I locate the pages that are NOT indexed?
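One common way to narrow this down, since the Sitemaps report only shows counts, is to split the big sitemap into small numbered sitemaps and watch which chunks report low indexed counts. A rough sketch (filenames and chunk size are illustrative):

```python
# Split a list of URLs into small sitemap files; after resubmitting them,
# the GWT Sitemaps report shows submitted vs indexed per file, which
# narrows down where the non-indexed pages live.

def split_into_sitemaps(urls, chunk_size=100):
    """Return a list of (filename, xml_string) pairs, chunk_size URLs each."""
    sitemaps = []
    for i in range(0, len(urls), chunk_size):
        chunk = urls[i:i + chunk_size]
        entries = "\n".join("  <url><loc>%s</loc></url>" % u for u in chunk)
        xml = ('<?xml version="1.0" encoding="UTF-8"?>\n'
               '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
               '%s\n</urlset>' % entries)
        sitemaps.append(("sitemap-%d.xml" % (i // chunk_size + 1), xml))
    return sitemaps
```

Group the URLs by category or page type when building the chunks and the report effectively tells you which sections Google is skipping.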
Thanks,
Ben
Hi,
I am just wondering about the accuracy of this report - does it pick up all the duplicate on-page content, or is there a limit?
We have an ecommerce store with a lot of copied-and-pasted descriptions - is there a limit on how much the Moz crawler picks up? In other words, once we fix what Moz has detected, will more be detected, or is the report limited to displaying, say, up to 200?
Hope you understand what I mean.
Thanks
Hello everyone on the new cool Moz!
I've optimized sites before that are dedicated to one, two or three products and/or services. These sites inherently talk about one main thing - so the semantics of the content across the whole site reflect this. I get these ranked well on a local level.
Now, take an e-commerce site - which I am working on - 2,000 products, all of which are quite varied: cookware, diningware, art, decor, outdoor, appliances... there are a lot of different semantics throughout the site's different pages.
Does this influence the ranking possibilities?
Your opinion and time is appreciated. Thanks in advance.
OK.
Thank you to everyone who helped clarify this issue.
I thought about becoming an editor, although, from what I read, the SEO value of having a DMOZ link may not be as strong as it once was...
Great info!
Thanks @Baldea - yes, you are correct: as soon as it processes the removal requests, it allows more to be submitted for removal.
In one sitting, I managed to get 1,000 URLs removed - within a few hours I could process more.
Hi,
How long does it take for DMOZ to process a suggest url?
Thanks,
Ben
Thanks Baldea....
Yes, I have done all of the above, but some pages still got stuck in Google's index - measures are now in place to stop that happening in the future.
In the meantime, I am using GWT's batch process to remove the URLs that already got into the index.
So how many URLs can I remove with GWT?
I'm dealing with thousands of duplicate URLs caused by the CMS...
So I am using some automation to get through them -
What is the daily limit? Weekly? Monthly?
Any ideas??
thanks,
Ben