Posts made by AlanBleiweiss
-
RE: How to handle brand description on product pages?
If they insist on having brand information, then yes, the alternative is to have a small portion of brand information, with a link to the full brand text on its own page.
-
RE: IP Change
If the domain name and URLs are identical to what they were, and if there are no security issues with the server the site is now located on, changing an IP alone should never cause a loss in rankings. Something else is going on.
-
RE: How to handle brand description on product pages?
Nitin
200 words - what is the point / value of having that repeated on thousands of pages? It's not unique, regardless of what some people think about it being okay because "lots of sites do it", or because some major brand that owns a market can get away with lots of bad SEO.
If those two hundred words are non-product-specific information, repeating them is not a best practice. Instead, that information should live on just one page, and if you believe, from a user experience perspective, that providing a link to it from each product page is helpful, that's what I recommend.
-
RE: Competitor outranking us despite all SEO metrics in our favour
EGOL,
Thank you for emphasizing quality (helpfulness / human value). I only mentioned it briefly in my response, yet it really does need to be a top priority.
-
RE: Competitor outranking us despite all SEO metrics in our favour
Lou,
"I just wanted to throw a few factors out there in order to encourage a response like yours - packed full of useful next steps for me to evalaute this further."
THAT is priceless

Pagination:
Loading all content on one page and using a "more" button to reveal it is not a best practice. Individual pages need to exist for individual sub-topic content. This is especially true now that it appears Google, while indexing content initially hidden from users, likely gives that hidden content less value than content that is immediately visible.
Pagination is important IF it is executed properly. If you have tens of thousands of results in paginated lists, is that one paginated group, or are they split out into separate groups based on similarity of content? If it's all just one massive group, that's likely another problem to look into, since pagination is meant to say "these pages all contain links to other content, where the entire group comprises very similar content around one primary topic".
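To make that concrete, here's a minimal sketch of generating the rel="prev" / rel="next" link tags Google has recommended for paginated series. The example.com URLs and /page/N pattern are hypothetical - adjust to your own structure:

```python
# Minimal sketch: generate the rel="prev"/rel="next" link tags for one
# page in a paginated group. The /page/N URL pattern is hypothetical.

def pagination_link_tags(base_url, page, total_pages):
    """Return the <link> tags for a page in a paginated series."""
    tags = []
    if page > 1:
        tags.append('<link rel="prev" href="%s/page/%d">' % (base_url, page - 1))
    if page < total_pages:
        tags.append('<link rel="next" href="%s/page/%d">' % (base_url, page + 1))
    return "\n".join(tags)

print(pagination_link_tags("https://example.com/widgets", 2, 40))
# <link rel="prev" href="https://example.com/widgets/page/1">
# <link rel="next" href="https://example.com/widgets/page/3">
```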
Internal linking should always point more to main category page destinations than to individual pieces of content. Linking more heavily to individual pieces would be unnatural from a usability perspective, and thus bad for SEO.
5,000 or so average crawl errors - what is causing those? Are they 404s? Were they previously valid pages? If so, those typically should not generate a 404; instead, each should 301 directly to a highly relevant live page (with internal links within the site updated accordingly).
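Here's a rough sketch of the kind of triage I mean - confirm each old URL actually returns a 404, then emit a 301 rule pointing to a hand-picked, highly relevant live page. The URL pairs are hypothetical, it assumes the `requests` library is installed, and the output is an Apache-style Redirect rule you'd adapt to your own server:

```python
from urllib.parse import urlparse
import requests

# Hypothetical mapping of previously valid URLs to relevant live pages.
redirect_map = {
    "https://example.com/old-category/widget-a": "https://example.com/widgets/widget-a",
    "https://example.com/retired-page": "https://example.com/widgets/",
}

for old_url, new_url in redirect_map.items():
    # Confirm the old URL actually 404s before adding a redirect rule.
    status = requests.get(old_url, allow_redirects=False, timeout=10).status_code
    if status == 404:
        # Apache's Redirect directive takes a URL-path as the source.
        print("Redirect 301 %s %s" % (urlparse(old_url).path, new_url))
    else:
        print("# %s returned %d - review before redirecting" % (old_url, status))
```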
So many more issues to consider...
-
RE: Competitor outranking us despite all SEO metrics in our favour
You are asking some very challenging questions, and using some very limited metric comparisons to try to figure it all out. SEO is not so easy. If it were, many sites would be in a continual state of leap-frog as they out-do each other in similar ways.
Here are just a few questions / considerations to add to your process:
1. Regardless of the number of instances of one or more keywords on a page, what is the total volume of highly relevant content on a given page? How helpful is that information in answering the questions your specific target visitors need answered? How helpful is it in allowing visitors to achieve a goal they came to your site to achieve?
2. How well organized is your content in regard to very similar pages being grouped together in both navigation and URL structure? Since my reading of your question implies the competitor site is much more "tight" in its singular focus, this is a critical factor for your site to evaluate.
3. If their site is much more 'tight' in its singular focus, how much is dilution a factor on the other pages of your site regarding topical focus and goal intent? If there is any serious dilution happening, you'd likely need even more content within the section you are comparing to overcome that site's strength in refined singular focus.
4. What technical issues may exist on your site that you may not have considered? Crawl efficiency, page processing speed, canonical or duplicate content confusion? There are many other questions I could list with just this one consideration. Even if the competitor site has some worse signals among these, if any of yours are problematic enough, that alone can be a contributing factor.
5. How much higher is the quality of your competitor's inbound link footprint in comparison to yours? Just having more links isn't at all a valid consideration if you don't dig deep into the quality issue. If they have 10% of your inbound link volume, yet half or most of their inbound links are from very highly authoritative sites and you have fewer of those, that is another massive consideration.
Those are just starting point considerations.
-
RE: Old school SEO tools / software / websites
Wordtracker for keyword volume and Overture PPC for keyword value were my two go-to resources. And WebTrends for the painful process of attempting to figure out what was happening on-site.
-
RE: Getting links on old blog posts
EGOL,
As always, you infuse wisdom into this discussion. I have always been an advocate of "content first, content last". Yet in 2015, search engines are only one piece of the puzzle, and until and unless other efforts for brand visibility / authority / trust are made, the overwhelming majority of sites on the web will leave way too much money on the table.
I happen to believe links need to be generated through our own efforts, yet it's not "traditional" link building. Instead, it's more about advocacy of brand, community service, and participation in the community in which our prospective/existing clients/customers live.
If we are not active in those ways, we build a house on sand.
Just my take on it.
-
RE: Getting links on old blog posts
The reason for the skepticism is the scale of spam out there, and the volume of ways spam efforts attempt to trick search algorithms. Google, even now, all these years into it, still does a very poor job of trapping some of that noise, and so the index remains polluted.
Of course, just building great content is never enough and won't ever be enough. So we also need to check the boxes that reduce the potential for Google to think "this isn't legitimate".
-
RE: Delay between being indexed and ranking for new pages.
This is one of the million questions we face in dealing with a less-than-clear message from Google on what they do.
Generally speaking, just one scenario is that they need to get confirmation signals when new content is discovered. That delay doesn't always happen, yet it does happen. The stronger a site is long-term, the less likely it becomes, yet even then it can occur.
-
RE: Getting links on old blog posts
Are they legitimate placement? Meaning - are the posts you seek links from real, quality, and relevant posts, and not on sites that are created for spam purposes?
Are you asking for a link, while NOT specifying anchor text, and NOT the wording they would use?
If the above scenario is what's happening, it's valid to reach out this way. As long as you leave it up to them to decide whether to include your content, what they write, and what anchor text to use, and there is no reciprocal exchange and no paid aspect, you "should" be fine.
Of course, it's impossible to know what some poorly trained manual reviewer might think about them; however, that's the only scenario where I'd be concerned in this situation.
And if all of the above criteria are met, those links will be helpful to readers of those sites, and thus have a chance of bringing actual human visitors to your site - which makes them valuable for many reasons, one of which is SEO.
-
RE: New Website Old Domain - Still Poor Rankings after 1 Year - Tagging & Content the culprit?
SEO has become much more complex over the years, especially given how aggressive Google has gotten.
Unfortunately, it MAY be at least PARTLY the case that the bad links were weakening the overall trust of the site in a way that, until the next Penguin update, you may not see value from that clean-up work. And even then, if other on-site issues exist, or there isn't enough truly high-quality, highly relevant off-site link and citation trust on a large enough scale, you may still be stuck in the weeds.
I poked around and here's my very initial take:
1. Critical page processing inefficiency issues. Even though my one-time quick speed check showed your home page loading rapidly, Google's PageSpeed Insights tool came back with your home page scoring a dismal 43 out of 100 points for desktop users, and 53 out of 100 points for mobile users. One-time, actual time-based speed data is not enough to trust speed considerations, so scores below 85 in PageSpeed Insights are a big red flag that you may very well have intermittent speed problems. And speed problems are a proven Panda contributing factor. A sketch for re-checking those scores programmatically follows after this list.
So I ran a SECOND speed test with a different tool, and in THAT test, your home page took 29 seconds to process in a DSL emulator. Any time a page takes 20 or more seconds, that is an absolute ranking killer - confirmed by Matt Cutts.
The fact that you have over 5 megabytes of combined content, resources, and file sizes just for the home page is only one of potentially several reasons why that is a very bad problem.
2. You don't have a traditional "services" silo (funnel) in your main navigation. You offer services, yet that information is buried on pages that are not dedicated to any specific service type, such as "Boston wedding band" or "Boston cover band". So even though your page titles on main nav pages use those words, the pages themselves are not refined enough in focus for those phrases - they're broader in content focus.
3. Your blog posts are fully included in the main blog index page view, which causes duplication of content between that page and the individual post pages.
4. You have the business address in your page footers, but that info is not wrapped in Schema.org markup for local business info. Schema is now critical as one part of overall SEO (this was confirmed just this week by Duane Forrester from Bing during Pubcon - he said "you need to use Schema, you do NOT want us having to figure it out"). A sample markup sketch follows after this list.
5. Have you checked your local listings consistency? Moz Local is very good at that. It's yet one more piece of the puzzle.
6. Regarding the old content - generally speaking, yes, very old, thin, or low-value content is also another consideration from a Panda perspective. Does it make sense to just kill those pages entirely? Maybe. Maybe not. Maybe there's a way to salvage them - through consolidation and 301 redirects, perhaps. It's not a simple, absolute decision to kill them off without understanding the complete picture.
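Following up on point 1: rather than relying on one-off manual checks, scores can be sampled repeatedly via the public PageSpeed Insights API. A minimal sketch - the v5 endpoint and response field names reflect the current API as I understand it, so treat them as assumptions to verify:

```python
import json
from urllib.parse import quote
from urllib.request import urlopen

# Assumed v5 endpoint of the public PageSpeed Insights API.
API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def pagespeed_score(url, strategy):
    request_url = "%s?url=%s&strategy=%s" % (API, quote(url, safe=""), strategy)
    with urlopen(request_url) as resp:
        data = json.load(resp)
    # Lighthouse reports performance as 0-1; scale to the familiar 0-100.
    return round(data["lighthouseResult"]["categories"]["performance"]["score"] * 100)

for strategy in ("desktop", "mobile"):
    print(strategy, pagespeed_score("https://example.com/", strategy))
```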
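And following up on point 4: here's a minimal sketch of what the footer address could look like wrapped in Schema.org LocalBusiness markup, rendered as JSON-LD via Python. Every business detail below is a placeholder, not your actual info:

```python
import json

# Minimal LocalBusiness example - every detail below is a placeholder.
local_business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Wedding Band",
    "telephone": "+1-617-555-0100",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Example St",
        "addressLocality": "Boston",
        "addressRegion": "MA",
        "postalCode": "02110",
    },
}

# Emit the <script> block to include in the page footer.
print('<script type="application/ld+json">\n%s\n</script>'
      % json.dumps(local_business, indent=2))
```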
SO...
I've only scratched the surface here. While your specific initial question may be a factor, you have many other critical flaws specific to the main pages of the site itself, and those are the most important pages.
As painful as it is to have been burned/disappointed by past SEO "professional" services, and even though you may be able to muddle through getting back on track, I'm very happy to see you reaching out here at Moz. It's a great community, and several people are very willing to help where we can, when we can, right here.
-
RE: SUMO Me / Social Sharing / Like Buttons
I agree with BOTH of the above suggestions. They are equally important. And be sure to test speeds across multiple testing tools:
-
Actual Google Analytics "Behavior" / "Site Speed" numbers (look at the 30-day site-wide average, AND the 30-day speed average for the individual pages where you want to put that code).
-
WebPageTest.org - run samples by choosing WPT servers in locations where you get most of your site traffic; test with DSL and mobile device emulators, and Chrome as the browser (options on the first screen in WPT). Then look at the "first view" / "fully loaded" time - do it for a sampling of pages on your site. Also look at the letter grades WPT shows for each (a scoring system where "A" is very good, "D" or "F" is a fail, and "C" is a red flag that you may have intermittent problems).
You can then go to WPT's "Details" page in that report and see, line by line, every single process that page runs and where there might be slow-downs in any one or more of them.
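For ongoing sampling, WebPageTest also offers an API. A rough sketch - the parameter names and the location string format are my recollection of WPT's public API docs, so treat them as assumptions, and you'll need your own API key:

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

# Assumed parameters for WPT's runtest.php API endpoint.
params = urlencode({
    "url": "https://example.com/",
    "k": "YOUR_WPT_API_KEY",           # placeholder - WPT requires a key
    "f": "json",                       # request a JSON response
    "location": "Dulles:Chrome.DSL",   # location:browser.connectivity
})

with urlopen("https://www.webpagetest.org/runtest.php?" + params) as resp:
    submission = json.load(resp)

# Poll the returned jsonUrl until the run completes, then read the
# firstView "fullyLoaded" time from the results.
print(submission["data"]["jsonUrl"])
```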
-
RE: Google index new data from my website page
If you have your sitemap XML file(s) set up properly, you can resubmit them each time you update that specific content. If the site is very large, I suggest having a separate sitemap file just for those review pages within the site, and resubmitting that one specifically. That can help motivate Google to recrawl the content sooner.
Also, do you have "last-modified" meta tags set up? That can help as well.
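As a minimal sketch, here's what a dedicated review sitemap with <lastmod> dates could look like, generated with Python - all URLs and dates are placeholders, and you'd resubmit the file through Webmaster Tools after each update:

```python
from datetime import date

# Hypothetical review URLs and their last-modified dates.
review_pages = {
    "https://example.com/reviews/widget-a": date(2015, 10, 1),
    "https://example.com/reviews/widget-b": date(2015, 10, 7),
}

entries = "\n".join(
    "  <url>\n    <loc>%s</loc>\n    <lastmod>%s</lastmod>\n  </url>"
    % (url, modified.isoformat())
    for url, modified in sorted(review_pages.items())
)

sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    + entries + "\n</urlset>"
)

# Write the dedicated review sitemap, ready to resubmit after updates.
with open("sitemap-reviews.xml", "w") as f:
    f.write(sitemap)
```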
Depending on how high the quality of the content is, it can also help to send other signals:
-
- Update the home page with a link to the newly updated review right in the upper portion of the home page's main content area.
-
- Consider a quality, not over-optimized press release distributed through a trustworthy release site - one where you describe the full review and link only ONCE in the body of the release, directly to that review.
-
- Tweet a link to the review page on the day you post the review as well. Now that Google is integrating Twitter more, that can further help visibility.
-
RE: Is using outbrains legal in Googles eyes?
It's not necessarily a matter of whether Outbrain is "legal" according to Google as a single consideration.
If the code is implemented in a way that doesn't redirect, and that linking is not nofollowed, then it is in violation of Google's policies. That shouldn't happen, though.
Where the problem becomes more complex is in how Google's algorithms might process a site that uses Outbrain, Taboola, or other similar services, with the end result that the site's ranking signals decline.
Several scenarios exist that can cause this.
1. Third-party "hey, here's a bunch of links to other places" widgets can often add heavy page processing delays - especially when there's code bloat, or when, at the code level, several server calls go out to that third-party server network (often to multiple different servers in that network), and bottlenecks come up across the web ecosystem.
2. Third-party widgets of this type can make it that much more difficult for search algorithms to separate the on-site content (both visible and within code that isn't seen) from the third-party, irrelevant, and often absolutely garbage-quality content contained in those widgets. This doesn't always happen, yet it can - and sometimes does - cause topical focus confusion, leading to misunderstood topical dilution.
3. Users often click on third-party widget links of this type, yet many other users hate them - finding them insulting and downright obnoxious when the quality of those links, and the images they stick in the user's face, are grotesque or near-porn in quality. That can sometimes then impact overall user experience and weaken site quality and trust signals.
Outbrain and Taboola are among the leading causes of ad-blocking now being a major problem for publishers and revenue. The lowest-quality ads, especially those disguised as "related content", get geeks, nerds, and intellectual site visitors boiling mad. In some ways they aren't as obnoxious as auto-play video ads or fly-over ads that block reading, yet in quality terms they are much worse. If the advertising industry doesn't clean up its act on quality, and if publishers don't do the same, the battle is only going to grow.
-
RE: Our web site lost ranking on google a couple of years ago. We have done lots of work on it but still can not improve our search ranking. Can anyone give us some advise
"On the category page - we are wondering whether we should also remove the right side menus ?"
Do you mean the left side menus?
If so, I can give you a simple answer and I can give you a more complex answer.
The simple answer is "if you link to categories and sub-categories that are not directly related to the category you are on at that time, it is at least somewhat of a distraction and dilution issue".
The more complex answer is "it depends, and without a full audit, I can't answer that because there are many other factors to consider, some of which are purely User Experience, some are SEO and User Experience, some are crawl allocation related, and some are pure technical considerations".
-
RE: Our web site lost ranking on google a couple of years ago. We have done lots of work on it but still can not improve our search ranking. Can anyone give us some advise
I am glad to hear you will work through the issues. Be aware that there is no guarantee these things alone will do the job; however, each is an important step in the right direction.
-
RE: Panda penalty removal advice
I'm also curious to know whether you've monitored Bing/Yahoo value over the course of your work. While it's rarely anywhere near Google's potential volume, I've seen good value gained from those as clients implemented recommendations, even when Panda was a prime issue (and the subsequent Panda refresh was a problem).
Overall it does sound like you're on the right track though.
-
RE: SEO and dynamic content
Yeah, their tests were a big help for my own confidence level. Just understand that it's still wise to test as soon after live-launch as possible. I hate nasty surprises from Google's systems...

-
RE: SEO and dynamic content
Ah yes - that's a challenge. From experience, I know that Google is "mostly" good at seeing and factoring in dynamically generated content when a page is coded to ensure the dynamic content is inserted into the DOM (Document Object Model). There's even a good article about this concept over at Search Engine Land, written by Adam Audette - someone I respect as an advanced expert in the industry.
However, even though that's been the case in my experience and in what Adam's team found, the only ultimate test I personally trust is Google's fetch and render system.
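One rough pre-launch check you can script yourself: fetch the raw HTML and see whether a phrase from the dynamically inserted content is present. If it isn't, that content only exists after JavaScript runs in the DOM, and fetch and render is how to confirm Google sees it. The URL and phrase below are placeholders:

```python
from urllib.request import urlopen

# Placeholders - use a real page URL and a phrase unique to the
# dynamically inserted content.
url = "https://example.com/widgets"
phrase = "text the page inserts dynamically"

html = urlopen(url).read().decode("utf-8", errors="replace")

if phrase in html:
    print("Phrase is in the raw HTML - visible without rendering.")
else:
    print("Phrase appears only after rendering - confirm with Fetch and Render.")
```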