Posts made by bradwayland
-
RE: Anyone know of companies or individuals that optimize google adsense for small players?
I'm not looking for PPC help at all. My question is about AdSense, not AdWords; I'm a publisher. Moz does not have a list of AdSense consultants. I don't think you're understanding what I'm looking for.
-
RE: Anyone know of companies or individuals that optimize google adsense for small players?
I visited the site. It doesn't appear that they even advertise that they work on optimizing AdSense. I'm not a novice at AdSense; I'm looking for people who go beyond the typical fine tuning you'd get from someone with 6 or 7 years of experience. Correct me if I'm wrong.
-
Anyone know of companies or individuals that optimize google adsense for small players?
I think I've done most of the obvious things, like maxing out the ads per page and adding channels, but I still think someone who really knows this stuff could probably increase earnings by 50-100%. Let me know if you have ideas.
-
RE: How can I tell Google not to index a portion of a webpage?
So what should it look like in the code?
If my area to block were a product description, it might say:
"Product Description
bla bla bla bla bla bla bla bla bla bla bla bla bla bla bla bla bla bla bla"
Secondly, if I disallow /iframes/ in the robots.txt, I would need to make sure we aren't using iframes anywhere else, correct?
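To make sure I'm asking the right thing, here is roughly what I'm picturing; the file names and paths are just made up for the example:

```html
<!-- Product page: the duplicated description is pulled in via an iframe
     whose source lives in a directory blocked by robots.txt -->
<h2>Product Description</h2>
<iframe src="/iframes/acme-widget-description.html" width="600" height="300"></iframe>
```

```
# robots.txt at the site root
User-agent: *
Disallow: /iframes/
```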
-
How can I tell Google not to index a portion of a webpage?
I'm working with an ecommerce site that has many product descriptions for various brands that are important to have but are all straight duplicates. I'm looking for some type of tag that can be implemented to prevent Google from seeing these as duplicates while still allowing the page to rank in the index. I thought I had found it with the googleoff/googleon tags, but it appears those are only used with the Google Search Appliance hardware.
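For reference, the tags I had found look like the snippet below, but as far as I can tell they are only honored by the Google Search Appliance, not Google's public web crawler:

```html
<!--googleoff: index-->
<p>Manufacturer-supplied description that is duplicated across many retailer sites.</p>
<!--googleon: index-->
```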
-
Is there a tool that works like crawl test that allows more than 3000 pages?
I enjoy using Crawl Test inside Moz, but I need to find a way to crawl all the pages on a site, probably in the neighborhood of 10,000 URLs. Does anyone know of a free tool, and if not, is there a paid tool that will do this?
-
RE: Has My Site Been Hit by Panda 4.0?
Were the 300 listing pages ever receiving traffic? If the answer is no, or if a significant number of them were never receiving any traffic, it might change what I would think you should do. Poor title tags will hurt your visibility in lots of ways, but I would not personally tie the title tag strategy to Panda. Panda is a content algo: it seems to look for duplicate, near-duplicate, thin, or poor-quality content, and then makes sure that sites matching those criteria are not ranking.
If you think more broadly about it, you might ask why Google would want to take a whole site down in the rankings over thin content on a few (or many) pages with little or no traffic. I think the reason they penalize the whole site is that they don't want webmasters producing this type of content. If they can get content creators to think twice before creating another 20 URLs about some topic, then over the long haul their job becomes much easier; they can fight off spam more easily because it won't work.
I was very angry when Panda 4.0 rolled out and some sites I own got hit. However, I feel empowered now to correct the issue. My suggestion for you is to compare the URLs with their links and traffic. There should be some clear-cut low-quality stuff that you can noindex. On the pages that drive traffic, I would make sure you are providing deep, helpful content. It's hard to discuss all the things you may need to do over email, but I think you are probably getting the idea. PM me if you want to chat more.
-
RE: Has My Site Been Hit by Panda 4.0?
If you saw a drop around May 19-20, I would say it is almost certain this was Panda. I've had a similar experience: I own a network of content sites that had never been clearly hit by a Panda update until Panda 4.0, when they got hit very hard on the day of launch. When you consider Panda, I would think of it less like an algo change and more like a filter. Panda seems to crawl the site looking for certain criteria, and if your site meets them, all your results get filtered. Remember that Google has clearly stated that their goal with Panda is for sites with thin or duplicated content not to rank high on the page. In my case, I still rank for everything that I used to rank 1, 2, 3 for, but now I rank 8, 9, 10. I have no proof yet that I can get out from under it, but I am making a lot of changes now that I believe will fix my own traffic issues. I found this article very helpful as I considered what I should do next:
http://cognitiveseo.com/blog/5536/google-panda-4-0-topical-authority-content-update-2014-case-study/
If you want to reach out to me privately I'd be happy to discuss in more detail.
Brad
-
RE: Best practice for a website where the publications (catalogues) expire frequently
I think this assessment is spot on. The superior strategy would be what you mentioned at the end: if you updated the content continuously, I believe you would create the most value for the user.
-
RE: Google Penguin 2.0 - How To Recover?
Actually, I would agree with Alan: it would be best to try to get links removed first and then use disavow. As for the reconsideration requests, I am picking up on a great deal of cynicism about them. Maybe this is just a strange coincidence, but nowadays it seems people always assume their loss in traffic is Penguin or Panda. I actually had a situation where a site lost a bunch of traffic in late April of last year. Of course no one thought it was a manual penalty, but in the end it was. After reviewing the information, we didn't believe the drop came from the algorithm changes but from a penalty. We did very little work because we weren't really aware of any wrongdoing. Then we submitted for reconsideration, and 3 days later we received notice that there had been a manual penalty and it had been removed.
Maybe this was a poor recommendation, but I do believe many people are trying to connect every loss of traffic to Panda and Penguin.
-
RE: Google Penguin 2.0 - How To Recover?
My recommendation would be to do the following:
1. Disavow all the links that you believe came from this practice.
2. After disavowing, contact all the sites and ask them to remove the links to your site.
3. Submit a reconsideration request through Webmaster Tools. Penguin 2.0 is not a manual penalty, but in this case it would be good to alert Google that your site was hit hard, and you may also have a manual penalty. If it is possible this was actually a manual penalty with strange timing, I would want to rule that out.
4. Change your strategy and start working on creating good content and earning good-quality links.
Good luck!
-
RE: Is anyone noticing if penguin 2.0 has been launched
Yes, here is a link to Matt Cutts' post about it:
http://www.mattcutts.com/blog/penguin-2-0-rolled-out-today/
He is reporting 2.3% of queries affected. I didn't know what to expect when I heard it would be 2.0, but based on the history of Penguin it doesn't surprise me that it was not nearly as large as the big Panda updates.
-
RE: What is the best link delete service?
Why are you choosing to delete the links? My recommendation would be to simply use Google's disavow tool to let Google know which links you don't want counted toward your site.
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=2648487
Is there some reason you want them actually deleted? Using disavow is a best practice and will be much quicker and cheaper.
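If it helps, the disavow file itself is just a plain text file you upload through the tool; the domains and URLs below are placeholders:

```
# Lines starting with # are comments and are ignored by Google
# Disavow every link from an entire domain:
domain:spammy-directory.example
# Or disavow a single linking page:
http://low-quality-blog.example/paid-links-page.html
```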
Good luck!
-
RE: Need to know best practices of Search Engine Optimization 2013
Agreed, I shouldn't have left them off.
-
RE: Need to know best practices of Search Engine Optimization 2013
My short answer would be to use the SEOmoz toolset to uncover some of your basic issues and work to correct them. From there I would start working on developing and building content on your site that can earn links. Earning links will require a significant amount of outreach to raise awareness about your content. I would also encourage you to focus on the user and what is good for them above your own thoughts about SEO. It is a delicate balance but building things just for SEO is never a good solution.
Here are some things I would encourage you to look at:
- All Whiteboard Fridays
http://www.seomoz.org/blog/category/whiteboard-friday
- All recent Matt Cutts videos from the Google Webmaster YouTube channel
http://www.youtube.com/user/GoogleWebmasterHelp/videos?view=0
- If it is in the budget, grab a ticket to MozCon in July
Hope this helps.
Brad
-
RE: Too Many On-Page Links
Make sure you check out this post:
http://www.mattcutts.com/blog/how-many-links-per-page/
I guess the larger question has to do with the point of having so many links in the footer. I think Google has overcome most of their issues with lots of links on a page, but the main question I would have revolves around what is good for users. If it were my site, I would probably pare down the links to make the page as user friendly as possible.
Your domain rank of 61 probably has more to do with your incoming link profile than your on-page factors; in my experience, I would not correlate those two items so closely.
If Moz is telling you to reduce the links, I would consider reducing them, but if having more links is better for your users, I would probably stick with it.
Hope this helps.
-
RE: Submitting URLs to Bing and Google
I don't believe submitting URLs is a strategy, if that is what you mean. I rarely submit URLs, but if I find a page or have a new page that I want crawled as quickly as possible, then I will submit it. I certainly don't have any proof of how much this has helped over the years, but I do believe it notifies Google and Bing to crawl.
-
RE: Re-code website and start from scratch?
I can see how changing the domain name might help you with the EMD update, but I'm not sure I understand how this will solve your Panda/Penguin issues. What is it that you will do with a new code base that you haven't done by changing content, meta tags, meta titles, etc.?
I'm not sure how substantial the site is in terms of revenue, but unless you were doing some really spammy stuff, I would continue working on your current site, and probably even your current domain. I think you might want to take your SEO hat off for a while and start focusing on your users. Does the new content help users accomplish what they need better? Ultimately, the only long-term strategy is to change your focus and produce the highest-quality, most helpful content you can come up with. In many cases, Panda and Penguin penalize people who are too focused on the SEO side; considering your users will help mitigate this issue. It's a long road, but many times starting from scratch is the longest road you could possibly take.
-
RE: Canonical tag in the Michael Torbert SEO plugin
I think your point is valid here, but we need to break down the canonical tag a bit for this one to make sense. The canonical tag is not a redirect; it is a suggestion, not a directive. http://support.google.com/webmasters/bin/answer.py?hl=en&answer=139394
In this case you are correct: the site owner is telling Google that he wants the www version displayed in Google SERPs, and if you search for xquisite events you will find that the first several results use the www, just as he requested. What is strange here is that the site owner also has a 301 redirect pointing the www version to the non-www version. It seems to me that he accidentally used the canonical on the wrong pages: instead of placing it on the pages he did not want primarily indexed, he did the inverse.
Based on his redirects, his REAL primary pages are non-www, but his canonical usage suggests the opposite. Hope this helps.
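To illustrate with a placeholder domain, a consistent setup where the non-www version is primary would pair the redirect and the canonical so they agree:

```
# .htaccess (Apache, hypothetical): 301 every www URL to the non-www version
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]
```

```html
<!-- On each non-www page (e.g. http://example.com/about),
     the canonical should point at that same non-www URL -->
<link rel="canonical" href="http://example.com/about" />
```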