Sorry, I should have clarified. The redirect needs the full path of the image. The actual redirect would be something like:
RedirectMatch 301 ^/images/15985\.jpg(.+) /images/15985.jpg
depending on where the image actually lives.
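Assuming the site runs on Apache and the image lives under /images/ (adjust the path to match), the .htaccess line would look something like this. Note that plain `Redirect` doesn't understand regex patterns, so `RedirectMatch` is needed, and `(.+)` rather than `(.*)` keeps the clean URL from matching and redirecting to itself in a loop:

```apache
# .htaccess at the web root; the path is an example, adjust to where the image lives.
# RedirectMatch supports regex patterns; plain Redirect does not.
# (.+) requires at least one extra character, so /images/15985.jpg itself is not caught.
RedirectMatch 301 ^/images/15985\.jpg(.+)$ /images/15985.jpg
```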
They absolutely need to be removed as quickly as possible. You are in the right, and that company is just doing what companies do and protecting themselves. If the articles on Moz aren't enough to convince your client, here's one from Forbes... maybe he'll listen to that one: http://www.forbes.com/sites/joshsteimle/2013/10/09/seo-rankings-tanking-check-for-bad-incoming-links/
A large portion of my job with new clients is now link cleanup and disavows, because they suffered this kind of penalty under whoever was doing their marketing before us.
The Googler said it didn't matter as long as it was consistent, but I think Vadim has some good points below.
Yes, the way it's usually set up, screen readers for people with disabilities can still read the hidden H1s, so you won't have that card to play against them.
I have seen this done in order to help screen readers, but I definitely do not recommend this. They may argue that semantic rules of HTML 5 dictate that this is an okay practice, and it is for HTML 5, but not for Google.
Google doesn't adopt new rules or semantic changes right away. Changes have to take hold and become common practice. This isn't common enough, and hidden H1s do open you up for potential penalties.
Ha, yeah. This job would be so easy if only clients weren't a factor. The ones that listen are always the ones that have more success.
Good luck convincing your client. Keep your cool; it can be frustrating when clients force you to let them shoot themselves in the foot, because once their foot is bleeding, they're going to blame you for the pain.
Imagine you are a user. If you're searching the singular, you're probably looking for a SERP with websites of single venue locations to browse through. If you're searching the plural, you're probably looking for websites that aggregate, list, rank or otherwise provide you with a predetermined group of venues. So, if you are a single venue trying to rank for "venues", you're always going to struggle against the sites that naturally use the plural.
With that said, yes, optimizing for a singular will usually give you some juice for the plural as well, but not as much. If you're starting from scratch, I would recommend going for the lower competition, more relevant key term first.
And remember, more traffic doesn't always mean more results. Targeting keywords without the proper searcher intent is going to get you traffic that doesn't convert.
Have you tried:
RedirectMatch 301 15985\.jpg(.+) 15985.jpg
This should catch every URL that starts with the image name and has extra characters on the end (RedirectMatch is needed here, since plain Redirect doesn't support regex patterns).
Hope that helps.
Consistency is key. I would leave the 4+ off. From my experience and from talking with Google employees, the more identical the better for the actual NAP. A Googler who works in that area told me even the difference between "blvd" and "boulevard" can possibly have an impact.
And when it comes down to, "Who do I make happy, Bing or Google?" The answer is almost always Google.
You'll notice some citation sites specifically request unique content for the descriptions. This is most likely so they don't end up with duplicate content, but it's a good practice anyway. I know it's a hassle, but unique descriptions let you tailor content to the specific citation site, and you can surround your links (even if most are no-followed) with different contextual words that may be picked up on.
It never hurts to do a little keyword research and do some optimization for blog posts, just don't go overboard and force keywords into a blog post for ranking's sake. As long as it's natural in the writing or headlines, it's almost always good to target larger groups of the traffic you want, as opposed to smaller or non-existent groups.
After some testing, we found WPEngine to be the best solution for WordPress site speed. They're a hosting company that specializes in WordPress, and they handle caching and CDNs for you without your needing to set anything up or worry about whether things are working properly.
I'm sure similar results can be achieved by building out your own caching and CDN; this is just another option for you.
It depends on how quick the promotions are. If the promotions really are short-term, I like to create a "specials" or "promotions" page where they can all be aggregated along with some unique content. Pages like that can get crazy traffic and even links. Visitors love them.
Then you can have the individual promotion pages branch off of that main one, and 301 them to the promotions page when they expire. Or maybe creating brand-new pages for each promotion isn't the right way to go about it at all, and the promotions should ONLY live on the main promotions page.
I make sure to disavow these as well. No need to disavow each link individually, just knock out the whole domain in the disavow.
This became common practice for me once I had to clean up a manual action penalty that wouldn't go away until the scrapers were disavowed.
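For anyone who hasn't built one before, the disavow file is just a plain text file uploaded through Google's disavow tool; a `domain:` line knocks out every link from that domain at once (the domains below are placeholders):

```text
# Scraper domains disavowed after a manual action review.
# Lines starting with # are comments; "domain:" covers every URL on that domain.
domain:scraper-site.example.com
domain:another-scraper.example.net
```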
Thanks for posting the follow up. I'm really glad I could help.
I hear Bluehost is okay. I've tried a lot of hosting companies over the years, and the majority have that arrogant attitude. Charging $60 per 301 is insane, though. If you decide to move your site, I recommend having a professional do it, or it could go very wrong.
Happy to see your rankings recover!
I hate to speculate on anything involving SEO, but I've always taken those 404s to be visits Google was able to grab data for. If Webmaster Tools can catch the data for a visit to a 404, it'll let you know about it.
What led me to this assumption was how similar those 404s were to existing pages, as if someone tried to type in a URL and got it wrong, or deleted part of it and hit "enter".
Take the info for what it's worth, which isn't fact, just an idea to get you rolling.
Depending on the number of reviews you need to mark up, this tool may be all you need: http://schema-creator.org/review.php
Just embed the generated code and you'll be good to go, or it will at least give you a good start.
There are also Wordpress plugins that allow this sort of generation in the backend.
Hope that helps.
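If you'd rather hand-code it than use the generator, a review marked up with schema.org's Review type in JSON-LD looks roughly like this (all names and values are placeholders; the generator linked above may emit microdata instead, which carries the same information):

```html
<!-- Example Review markup in JSON-LD; swap in your real product, author, and rating. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Review",
  "itemReviewed": { "@type": "Product", "name": "Example Widget" },
  "author": { "@type": "Person", "name": "Jane Doe" },
  "reviewRating": { "@type": "Rating", "ratingValue": "4", "bestRating": "5" },
  "reviewBody": "Solid product, arrived quickly."
}
</script>
```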
Unless I'm mistaken, frame forwarding is masking: http://en.wikipedia.org/wiki/Domain_Masking
In my experience, I still consider it to have some weight, but not enough to beat out a better resource that is producing better content and getting more links.
I don't think age is a top-level factor to consider, but it has enough weight to get a leg up on a property that is similar in status.
The difference between a 4-year domain and a 1-year domain shouldn't be a factor in this case. Something else is in play, and it sounds like it's time to dig even deeper.
This looks extremely spammy to Google. It also makes for a bad user experience, since you're trying to trick the user into believing they're on one site when they're actually on another. Even if that isn't your intent, Google doesn't like it. It's not a guaranteed penalty, but it's one way this setup could hurt you.
You won't get any SEO value either; any keyword signals the domains may have passed to your site should already be on your pages anyway (if freshly bought domains being 301'd passed signals in the first place).
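If the goal is just to point the spare domains at the main site, a server-side 301 is the cleaner alternative to frame forwarding. A sketch for Apache mod_rewrite, assuming both domains resolve to a server you control (the domain names are placeholders):

```apache
# .htaccess served for the extra domain: 301 every request to the main site, keeping the path.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?spare-domain\.example$ [NC]
RewriteRule ^(.*)$ https://www.main-site.example/$1 [R=301,L]
```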