Best posts made by Travis_Bailey
-
RE: What to do about resellers duplicating content?
There was actually a pretty solid Whiteboard Friday covering a similar topic. The Googles are only 'pretty good' at figuring out which page is the original, and which site should be given better placement for the same content. So you're not being silly for being concerned. (Think Panda.)
I honestly think you'll find something you can use in the video, so I won't get too carried away on that subject.
Your next problem is buy-in. Will your client do what needs to be done? To get the sale you have to sell.
Present it as something that hurts the resellers as well as the client, because it likely does, or will, hurt both in a material way. If the resellers aren't ranking well due to the duplicate content issue, your client is losing sales, and the resellers are directly losing potential revenue.
So whatever you put together out of this, the wise will heed your words - as they're losing money.
-
RE: One-Pager and SEO
This may not be exactly what you had in mind, but why not handle it like any typical WordPress blog installation?
Just show a preview snippet with a link to the full profile. You would avoid duplicate content. You could have what you want in effect.
If it were my deal, I would go with a static home page that explains what the whole thing is about, with navigation to the Difference Makers. That would essentially be your post type: show a snippet of each profile that links through to a post displaying the full profile.
If there's an actual blog in there for some reason, you could likely handle either one with a custom post type.
But if you don't like the static home page idea, I suppose you may be able to get away with a preview that links to an actual post/bios on the home page.
I would be interested to see what anyone else thinks. (Not sarcasm, really want to see what others think about it.)
-
RE: Is it OK to include name of your town to the title tag or H1 tag on a blog to enhance local search results
As The Chris Menke has stated, there's no problem with including the city and state in your metadata or on-page markup. It's actually an advantage of sorts.
There are nearly a dozen cities named Dallas in the US. Say you were located in Dallas, TX. What would separate you from Dallas, OR? Citations would, if you're doing what you need to do for a local business. But I've found that on-page signals are carrying a bit more weight, and that weight is increasing.
Not sure yet? You can now verify a Google My Business listing via Google Webmaster Tools for an indeterminate number of business categories. If that's not a sign that on-page is gaining weight in the local search scene, I don't know what is.
In fact, the Google Local team has stated that the business's domain is the strongest factor. Wouldn't you like to give them a few more signals, easily? They are the donkey; you have the carrot.
But I would suppose your question arises from an objection to writing "Profession/Service - City, State". There are ways you can do that, and it will still look alright. In fact, wouldn't you want visitors to know immediately that they came to the right place, even if it sounds a bit generic?
People don't have time. You have to show them they came to the right place immediately. Title and header markup are excellent ways to do so.
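As a sketch, a title tag and H1 along these lines (the business name and service here are placeholders, not from the original question) keep the geo-signal without reading like boilerplate:

```html
<head>
  <!-- Hypothetical example: service and city/state up front, brand at the end -->
  <title>Emergency Plumbing in Dallas, TX | Smith Plumbing Co.</title>
</head>
<body>
  <!-- The H1 can echo the location more conversationally -->
  <h1>Dallas, Texas's 24-Hour Emergency Plumbers</h1>
</body>
```

The point is that the city and state appear naturally in the markup a searcher sees first, rather than being stuffed in awkwardly.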
-
RE: Google Structured Data Testing Tool issue.
Your question has been up here for a while today, and no one else had anything. Perhaps our European brothers and sisters will add a bit later.
The itemscope for LocalBusiness is wrapped around the <html> tag. Usually, Schema markup is wrapped in a <div>. Perhaps that's one of the issues.
I tend to keep it tight, on one page, in the body content. I've never had any issues that way.
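To sketch what I mean by keeping it tight in the body (the business details below are placeholders), a minimal LocalBusiness block in microdata would look something like:

```html
<!-- Minimal LocalBusiness microdata, scoped to a single <div> in the body -->
<div itemscope itemtype="https://schema.org/LocalBusiness">
  <span itemprop="name">Smith Plumbing Co.</span>
  <div itemprop="address" itemscope itemtype="https://schema.org/PostalAddress">
    <span itemprop="streetAddress">123 Main St</span>,
    <span itemprop="addressLocality">Dallas</span>,
    <span itemprop="addressRegion">TX</span>
  </div>
  <span itemprop="telephone">(555) 555-0123</span>
</div>
```

Everything the itemscope covers sits inside that one <div>, so there's no ambiguity about where the item starts and ends.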
-
RE: Website was hacked and is clean now. What to do next for Google?
TL;DR Follow Google's Guide on Recovering a Hacked Site
Orrrrr.... this one.
First, are you 100% sure it's clean? Some people get lucky, and have a clean backup. Others aren't so lucky.
Some hacks come in 'low and slow'. In other words, the site could have been compromised well before any outward signs became apparent. So definitely consult a security professional, if you haven't already done so. (Sometimes your host will 'clean' it for you, but miss critical points, and you're back in the same spot later.)
I suppose it couldn't hurt to remove URLs created by the hack. But if they're all gone, resolving as 404, they will drop out of the index in a couple of weeks or sooner. The main thing is that the site is clean.
If you're sure it's clean, submit the site for review. The guides I've linked mention that it will take several weeks. And once Google has determined the site is clean, you should see any of the 'hacked' and/or 'malware' snippets disappear within a few days.
And I seriously recommend having a security professional review the site and hosting. A security professional will not only help you determine the cause of the problem, but also with mitigation of future risks. Best of luck.
-
RE: Why has my site dropped out of search rankings?
The main thing I noticed after running it through Majestic is that there wasn't much in the backlink department worth keeping. Sure there was a link from Google, DMOZ and something else. But the majority was basically directory spam.
So even if you managed to get the majority of the links removed/disavowed, there's also the fact that there's little else propping the site up anymore. And with the recent Penguin 3 rollout, there's a distinct possibility that low-quality links which were still passing juice have now been devalued. So there's that as well.
-
RE: Is it necessary to use Google's Structured Data Markup or alternative for my B2B site?
Schema (structured data) is supported by all major search engines in the Western hemisphere, and by two or three search engines elsewhere. The search engines, Google in particular, may use this structured markup to display various things about your business in search engine result pages. While I don't know it to be a ranking factor, it seems advantageous to tell search engines everything about the business in a language they can easily understand.
Without knowing more about the case, I can't say in a 'yay or nay' sense whether it may be worth the effort. If someone in your ranks understands or can learn Schema markup, go for it, you have nothing to lose. Though there's no guarantee applying the markup will result in a unique snippet for any given query.
Ultimately, it seems prudent to do so. Though it may take a little time to figure out what markup you'll need.
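If someone does take it on, a JSON-LD Organization snippet is usually the least invasive place to start, since it sits in one script tag and doesn't touch the visible HTML. All the values below are placeholders, not details from the actual site:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example B2B Corp",
  "url": "https://www.example.com/",
  "logo": "https://www.example.com/logo.png",
  "contactPoint": {
    "@type": "ContactPoint",
    "telephone": "+1-555-555-0100",
    "contactType": "sales"
  }
}
</script>
```

You can paste something like this into Google's Structured Data Testing Tool to confirm it parses before it ever goes live.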
-
RE: What are the lowest acceptable metrics for a link?
All of the numbers are borderline arbitrary. I would simply state: "Don't be stupid." That will get you further than anything.
-
RE: Webmaster Tools Verification Problem
That's odd that neither method works. There are still a few more methods you can try; you can find them here. If you could update us on how that works out for you, it would be appreciated.
Whenever I see things like this, I always wonder how often people get fired for system problems beyond their control. At any rate, once you re-verify your account, the historic data should still be there. So no worries on that part.
-
RE: Referencing links in Articles and Blogs
I'm just going to leave this here. ; ) It would seem that all of the typical means of citation can be recognized as such. Perhaps too readily?
-
RE: What are best options for website built with navigation drop-down menus in JavaScript, to get those menus indexed by Google?
I would generally prefer CSS over JS for navigational elements, but that probably isn't the problem here. Google can crawl JavaScript and attribute links fine. And per SEM Rush, it looks like the site is enjoying a pretty sharp uptick in organic traffic recently. That would seem to be at odds with big indexation problems.
I'm not sure if it's just my network (I'm on a subpar connection right now), but I noticed that some CSS and JS files were timing out when I crawled the site. That could lead to a big problem. I would advise that someone check the server log files to see whether those files are regularly timing out. Ideally you would want the CSS and JS files combined/concatenated where possible, to reduce the possibility of any such rendering issues.
More on that from SE Roundtable
I checked the cache for the EN version of a few of those pages, and they appear to be cached fine.
cache:https://f5.com/products/security/distributed-denial-of-service-ddos-protection yields a fully rendered cached copy, which is pretty much what we want.
But I do see some issues that could lead to problems with indexation/display. The site has a number of different languages/translations, yet I noticed that the hreflang attribute is missing. It's strongly recommended that hreflang be implemented. You're good on the language meta tag Bing recommends, though.
That alone would cause some problems, especially on a site that large. I researched Radware, a competitor of theirs, years ago. F5 seems like the type of organization that would pay for a decent translation (my German and Spanish are so limited I couldn't discern the quality of the translations). But if the content is automatically generated, that would more than likely lead to indexation problems as well.
Another thing I see is that each translation is marked as canonical. This could also cause problems with display and link equity.

Here's more on internationalization from Moz and Google.
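To sketch what the fix would look like in the head of the EN product page: each language version lists every alternate plus itself, and canonicals to itself rather than to another translation. The /de/ and /es/ paths below are assumptions for illustration; I haven't confirmed how F5 actually structures its translated URLs:

```html
<!-- On the English page: a self-referencing canonical... -->
<link rel="canonical" href="https://f5.com/products/security/distributed-denial-of-service-ddos-protection" />
<!-- ...plus hreflang alternates, including a self-reference and an x-default -->
<link rel="alternate" hreflang="en" href="https://f5.com/products/security/distributed-denial-of-service-ddos-protection" />
<link rel="alternate" hreflang="de" href="https://f5.com/de/products/security/distributed-denial-of-service-ddos-protection" />
<link rel="alternate" hreflang="es" href="https://f5.com/es/products/security/distributed-denial-of-service-ddos-protection" />
<link rel="alternate" hreflang="x-default" href="https://f5.com/products/security/distributed-denial-of-service-ddos-protection" />
```

The German and Spanish pages would carry the same hreflang set, but each with its own self-referencing canonical.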
I would also look for ways to build internal links to the important products (DDoS mitigation is supposed to be a huge money maker now) from the home page, in the body content. Not just in boilerplate areas (nav, footer, etc.).
Edit: Forgot to mention that the mobile menu doesn't appear to link directly to the important products. I would make sure the experience is the same across devices.