If the product will be coming back, I would have the platform simply notify the user on the product page that the item is temporarily unavailable - while keeping the product page indexed.
If the product is not coming back, 301 redirect its URL to a similar product.
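If the server runs Apache (an assumption on my part; both paths below are placeholders), that redirect can be a single line in .htaccess:

Redirect 301 /products/discontinued-widget /products/similar-widget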
Hope this helps.
Drops such as the one you have experienced can be difficult to assess. I would advise the following procedure to rule out other issues first. As Egol correctly stated, it is important not to jump to fixes right away, for two reasons. First, the situation could be temporary and could revert. Second, any changes you make will obscure the potential issues, making it more difficult to find the problem spots.
1. Check robots.txt to ensure no new rules have been added that could be blocking major pages
2. Check your canonical link tags, if you use them, to ensure they are not pointing at incorrect URLs
3. Check inbound links, using both link analysis tools and Webmaster Tools, for any suspicious bursts of links or links that look dodgy which you cannot account for
4. Run a sitewide meta check on all titles and meta descriptions and ensure everything is correct. There are software companies that offer fairly inexpensive options that will spider the entire website relatively quickly (a scripted sketch follows this list). Do this late at night, after the traffic swell
5. Use Xenu to check for broken links and fix them, even if there are only a few
6. Run the Fetch as Googlebot tool in GWT and check for any instances of odd code or potential problems
7. Analyze your analytics to determine which keyword clusters lost the most positioning. This can often give you clues as to what might have happened.
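If you want to script parts of checks 2 and 4 yourself, here is a minimal sketch, assuming Python with the requests and beautifulsoup4 packages installed (the PAGES list is a placeholder for your own major URLs):

# Minimal sketch: spot-check canonical tags, titles and meta descriptions.
import requests
from bs4 import BeautifulSoup

PAGES = ["http://www.example.com/", "http://www.example.com/category/"]

for url in PAGES:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    canonical = soup.find("link", rel="canonical")
    title = soup.find("title")
    description = soup.find("meta", attrs={"name": "description"})
    print(url)
    print("  canonical:", canonical["href"] if canonical else "MISSING")
    print("  title:", title.get_text(strip=True) if title else "MISSING")
    print("  description:", description["content"] if description else "MISSING")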
Hope this helps.
Todd
www.seovisions.com
The answer is really "it depends."
Ultimately it depends on a large number of things, a few of which are:
-The number of inbound links to the website and to pages of the website
-The authority of the website
-The value and presentation of the snippets on the page
-The type of content being presented
-The source of the content snippets and the number of times they are used on other websites
There seem to be various thresholds for duplicate content. Certainly, higher-authority websites utilize this "mashup" approach quite successfully, while others fail and are subsequently largely filtered out of the SERPs.
I would suggest utilizing larger content marketing campaigns geared at the industry / vertical authorities.
At the same time, be careful not to place too much weight on DA as a metric of concern. DA is a result of high-level link and content strategies.
It is great for comparative purposes, and serves as a fantastic benchmark, but search revenue, CRO, acquisition and retention for your audience are probably more important.
Hi Richard,
Having the link positioned in the upper navigation will help ensure that the search engines view that link as important. Just as importantly, it helps consumers know that the link and the information in the blog are important.
Placing links only in the footer removes them from visibility considerably, and search engines can detect link placement on the page, placing less weight or value on such links.
Also, if you are building high quality content in your blog to engage consumers, then having that link in an appropriate area increases the potential of your audience finding that content, engaging with it, and supporting what you do.
Good luck in business!
Hi Mnkpso,
Google often shifts the impact of each element / factor in terms of how they are applied to both local and non-local results.
However, that being said, experience shows that distance from the centroid does in fact play a role in local search engine rankings. David Mihm, one of the foremost experts in Google Local, has cited this in his study.
Additionally, we find that there is a correlation between the physical address's distance from the centroid and ranking.
It is important to note that distance from the centroid is only one metric, and although a valuable and important one, in and of itself it cannot entirely determine results.
Hope this helps,
Todd
Here's another:
http://www.w3schools.com/web/web_glossary.asp
The key, David, is in:
1. Brainstorming
2. Paring down / filtering potentials
3. Testing potential campaigns if possible (pre-promote, test the waters for response rate)
4. Creating said content
5. Massively promoting and doing outreach, sharing in SM
6. Feedback and analysis
Like any campaign in marketing, it's all about planning well in advance, due diligence, and small, incremental tests. I like to think of each of these campaigns as a "lean campaign" - to borrow the concept from many, including Eric Ries.
Small, testable campaigns allow you to pivot and change quickly if the results are not there, or if assumptions you made (your hypotheses) turn out to be less accurate than initially considered.
Hope this helps!
Todd
Agree 100% with David and Fredrico. Noindex, follow your tag pages.
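For reference, that directive is a single meta tag in the head of each tag page:

<meta name="robots" content="noindex, follow">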
Good value, solid DA / PA and nice co-citation links with other authorities. I would recommend it.
If the new pages are fantastic content, this will only help.
They will likely attract more inbound links if done well.
I would focus on ensuring that the link real estate (taxonomy, navigation) doesn't change in this case if you want to retain the rankings you have.
If the new pages are sub categories / children of the current pages, then link down to them from those pages, and back up to the current pages in the breadcrumb (which will further reinforce those rankings). A sketch of the pattern follows.
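A minimal sketch of that breadcrumb on a sub-category page (the names and URLs are placeholders):

<nav class="breadcrumb">
  <a href="/widgets/">Widgets</a> &gt; <span>Blue Widgets</span>
</nav>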
Hope this helps!
Hi Michael,
I can sense your frustration, but business and marketing are unforgiving. And while I do agree with your sentiments, the best thing you can do is be absolutely 100% thorough in your approach to the problem.
I would recommend running Screaming Frog and Xenu and re-evaluating.
Here is a handy checklist that you can work through to ensure you are covering your bases, if you have not already.
1. Run Xenu and Screaming Frog and examine the output in detail. Look for any pages which have outlinks, scrutinize those outlinks, and ensure they are nofollowed. Be sure to include all subdomains, even subdomains you believe are noindexed at the robots.txt level.
2. Check the page code on 20 random pages in each subdomain to ensure there are no hidden bits of code you might be missing that contain an external link. Check the CSS and JavaScript while you are at it, just to be certain.
3. Check GWT to ensure that there are no warnings or alerts as to any hacking or suspicious page activity.
4. Thoroughly audit any and all widgets to ensure they do not contain outlinks, and if they do, assess as necessary based on the authority of the destination website and the relevance / commercial nature of the link.
5. Ensure any business listings on the website are nofollow
6. Nofollow any links (for now) from any areas where you have published articles (I noticed that you do allow for this option)
7. Nofollow signature links in the forums, if they are not already. Nofollow profile page links in the forum, if they are not already.
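As a supplementary check for items 1 and 7, here is a minimal sketch, assuming Python with the requests and beautifulsoup4 packages installed (the domain and page list are placeholders):

# Minimal sketch: flag external links that are not nofollowed.
from urllib.parse import urlparse

import requests
from bs4 import BeautifulSoup

SITE = "example.com"  # your domain (placeholder)
PAGES = ["http://example.com/", "http://example.com/forum/"]  # pages to audit

for page in PAGES:
    soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")
    for a in soup.find_all("a", href=True):
        host = urlparse(a["href"]).netloc
        rel = a.get("rel") or []
        # An external link without nofollow is worth a closer look.
        if host and SITE not in host and "nofollow" not in rel:
            print(f"{page} -> {a['href']} (followed external link)")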
When you run these reports and checks, make extensive notes of what you are doing. Google is looking to see that you have put exhaustive effort into the process. Since you have control over the links on your own domain, the level of scrutiny is higher.
On a side note, some of your important business pages seem to be hanging up on an internal redirect. The About Us and Privacy Policy pages in particular appear to be caught in a redirect loop.
Wishing you the best of luck!
Best,
Todd
Very nicely.
And by phone, where possible.
AFAIK, Google is not often using rel="publisher" as a signal and does not (yet) seem to use it for display in the SERPs. You should use your G+ profile with rel="author" if you wish to benefit from the increased CTR in the SERPs, and switch to publisher at a later time, once Google shows clear signals of using it.
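For reference, authorship markup is a single link element in the head of the page, pointing at your Google+ profile (the profile URL below is a placeholder):

<link rel="author" href="https://plus.google.com/your-profile-id">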
Hope this helps,
Todd
Hello James.
We have done some controlled studies which do show a positive impact on rankings for particular pages within the website when they are based on a very controlled, or tight, taxonomy. I believe this is because of the ability to develop internal links to pages that might not otherwise receive a large number of internal links within the website. Additionally, since these pages are in the breadcrumb but likely not linked from other places within the general template, first link attribution is a factor.
Also, it is important to understand the impact of breadcrumbs on conversions. There are numerous studies which show the positive impact of well defined and helpful breadcrumbs on site stickiness and metrics like bounce rate, time on site (TOS) and conversion rate.
What's good for the user is generally good for the search engine.
Best,
Todd
I agree with Alan, and would like to add that I believe the silo method can better reinforce the proximity of closely connected clusters of keywords. In other words, in a silo structure, tightly knit keywords by nature support each other and pass theme and relevance value to each other by default when a strong, supportive breadcrumb is in place. With a flat site architecture, extra programming often needs to be done to establish those relationships as they relate to internal pages.
Ultimately, I would never make this decision based on PR. I would make the decision based on who could give me the most editorially based link that looked natural, on a page that looked natural, and on a website that does not appear to sell links. That would likely be the strongest link, and the least likely to raise any particular red flags.
PageRank is one of hundreds of metrics. Developing a very natural looking, diverse backlink profile consisting of a healthy dose of editorially based links will also land you in desirable places.
In competitive niches we aim for 2 per page; in less competitive niches, we aim for 3.
Hope this helps!
It's a possibility. There have been ongoing discussions on internal linking on most of the major SEO forums for quite some time.
My personal feeling is that most websites can effectively reduce the number of internal links by auditing their link flow, determining their "ideal" real estate, and ensuring they are not needlessly duplicating links, which steals some of the juice that might otherwise go to some of the bigger pages.
Fewer than 5% of visitors ever see the footer on the average website, so my opinion has always been that the footer should contain supporting links that help the user in "context". Contact, About, Sitemap and Investors, for example, are classic links one might find there.
Big real estate - or important pages in the website - should be linked to from your main nav or areas above the fold with lots of user exposure.
Keep in mind when changing and removing links that it is a process. Do not go in and remove all, or a significant part, of your links in one week.
Make one or two good changes, then wait for a period of a week or so, then make other small changes over time.
Hope this helps.
Todd
www.seovisions.com
Hi David, I feel your pain. This type of problem is more common than you might think!
Fortunately there are steps we can implement to help curb the effects.
1. You need a copy of the old website. It is probably available at old.toughtimeslawyer.com, so that you can revert to the old website or check it for errors. While they should have noindexed that subdomain already, it is helpful to have it. I would recommend asking them to noindex it now.
2. You will need to map the old pages to the new, and then apply redirection and / or 404, depending on the page in question. For now, I would simply 301 redirect the old pages.
There are two ways to go about this:
A) Ask the developers to match the old URLs on the new website. This is preferred, because the same URLs are then being used and no redirection needs to take place. I would ask them this first, but if they are not able to do this, then:
B) Apply a 301 redirect scheme to redirect the old pages to the corresponding pages on the new, live site.
Depending on how many pages you have on the site (for sites over 100 pages you might consider using tools for assistance), you will want to create an Excel sheet which maps:
Column A - old URL
Column B - new URL which now holds this information
Your job is to match the old content on the old subdomain with the new, live content.
Once this is completed, and triple checked, back up everything and then ask the development company to apply 301 redirects from the OLD URLs to the NEW URLs. This passes some of the link weight, age and authority over to the new URLs, and tells the search engines to place their emphasis on the new URLs for the content.
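If the site runs on Apache (an assumption on my part), here is a minimal sketch in Python that turns such a two-column mapping, exported as redirects.csv, into .htaccess rules:

# Minimal sketch: turn a CSV redirect map (old URL, new URL) into
# Apache "Redirect 301" rules. Assumes redirects.csv was exported
# from the spreadsheet described above.
import csv
from urllib.parse import urlparse

with open("redirects.csv", newline="") as f:
    for old_url, new_url in csv.reader(f):
        old_path = urlparse(old_url).path  # Apache matches on the path
        print(f"Redirect 301 {old_path} {new_url}")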
3. Once completed, I would run the following checks (or hire someone to help you with this)
-Redirection check (ensuring those redirects are correct by checking the header responses; a scripted sketch follows this list)
-Broken links check on the new website
-Navigation and internal links check
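Here is a minimal sketch of that redirect check, again assuming Python with the requests package (the mapping below is a placeholder):

# Minimal sketch: verify that each old URL 301s to the expected new URL.
import requests

MAPPING = {
    "http://old.toughtimeslawyer.com/about": "http://www.toughtimeslawyer.com/about",
}

for old_url, expected in MAPPING.items():
    r = requests.head(old_url, allow_redirects=False, timeout=10)
    location = r.headers.get("Location")
    ok = r.status_code == 301 and location == expected
    print(f"{old_url}: {r.status_code} -> {location} {'OK' if ok else 'CHECK'}")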
Depending on the gravity of the design change, this might not completely restore rankings. Ranking in Google is based upon a myriad of factors, and site quality and layout is one of them. There is much more to the overall process, but the information above should help you resolve the current issue as much as possible.
Changing designs is a complex process and should always be approached with caution.
Once you make the changes, I recommend a period of about 2 weeks to monitor the results.
Hope this helps!
Good questions; it's great to see that you have taken the time to carefully consider each step.
Ranking on a national level in the organic SERPs is a much different animal than ranking locally.
Local SEO
The factors required to impact authority and visibility at a local SEO level include, to name a few, proximity to the centroid, correct (consistent) NAP citations for the business, positive reviews for the business, and the adoption of localized title tags and copy, among many others.
However, those elements are guidelines, and the level of work and commitment needed to rank in each vertical at a local level is entirely dependent on the level of competition within that vertical crossed with the competitive level of the city.
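For example, a localized title tag might look like the following (the city and business are placeholders):

<title>Emergency Plumber in Austin, TX | Acme Plumbing</title>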
National SEO
In order to rank nationally for targeted terms, you will need to build much higher overall authority, which loosely translates to a higher quality (and potentially volume) of links. You rank well on a National level when Google believes that you are one of the best choices for the consumer Nationally. This means your audience is National, and therefore your link strategy, content strategy, social media strategy and mind map should all be geared at looking locally and sprinkling in various major cities - starting with your core city.
I wouldn't suggest removing or altering your G+ account or any of your local settings. You are simply telling Google that, on a local stage, you wish to be known for servicing businesses within a radius, or for consumers visiting your business.
This account gives you stability and authority, acting as an anchor and proof / verification of a physical business. This is an important step in ALL SEO nowadays.
Focus on increasing the scope of your strategy, including your National targets, re-target your Title tags to reflect National cities / shorter tail, and work hard, and you should be able to impact that market.
Hope this helps,
Todd