Category: Technical SEO Issues
Discuss site health, structure, and other technical SEO issues.
-
Migrating to a new site - keywords question
How about focusing on the keywords and products that make you money? My husband sells kits for model warships that shoot and sink each other. A high-volume keyword would be battleship, but that doesn't get us any sales. A low volume keyword would be uss iowa 1:144 scale turret cover, but that's a low-priced item that's not going to help us too much. For us, a different keyword makes much more sense.
| KeriMorgret0 -
Ecommerce website with too many links on page
How big are these mega menus? If you have thousands of links, then you'd better cut 'em back. If you have a couple hundred on a powerful site with great rankings and spider flow, I would not worry about it. I have seen those warnings for my site. According to them, my site has way too many links, but I am happy with it and the site does great.
| EGOL1 -
How to remove the Author Link from a Post on a WordPress Website?
That small snippet of text isn't likely to be the cause of your duplication issues, in all honesty.
| SEOAndy0 -
Wordpress Categories and Over-Optimization Question
Never tried exactly what you are asking, but with WordPress websites I usually apply simple rules that are best from both a user and an SEO perspective... 1. Never Confuse! Don't leave your visitors confused when they land on your website, or you will see a good increase in your bounce rate. If you think adding a category will make the sidebar more confusing, avoid it, because this will impact your SEO as well; user reaction on the website is one of the important SEO factors. 2. Over-Optimization: I have witnessed quite a few examples of over-optimization, and honestly speaking, over-optimization is simply the addition of similar content and keywords again and again on a page in a way that does not make sense. If adding a word around 4 to 5 times still makes sense within the content, then I don't think it will be counted as over-optimization of keywords on the page... 3. Always Think About Visitors: Think of the visitors and the search engines will follow you! This sounds ironic but it's true: visitors are the end point, and as search engines change they lean ever more towards how users (visitors) think and react to different designs, websites, and niches, and how much time they spend on each page. I think not just with this particular problem but with every on-page problem, you should focus more on users (visitors) instead of only on search engine bots.
| MoosaHemani0 -
Is duplicate content OK if it's on LinkedIn?
Hi Stephanie- If the copy is just basic 'About the Company' blurbs or paragraphs, I would be fine with it. If LinkedIn has duplicate versions of entire blog posts that have the potential to get shares, links, and rankings, I would remove them from LinkedIn. Do some simple search queries for phrases from the content. Are you being outranked by LinkedIn? How much copy is duplicated between the two sites, and how important are these pages to your potential acquisition of organic traffic? Those are the two main questions I would be thinking about.
| anthonydnelson0 -
After I 301 redirect duplicate pages to my rel=canonical page, do I need to add any tags or code to the non canonical pages?
If I am understanding you correctly, you shouldn't need to rel=canonical anything in this case if you are 301 redirecting everything. Typically you will use the rel=canonical if you are unable to implement a 301 or if a page can be accessed multiple ways, but has a unique URL. Implementing the 301 will direct the link juice from the duplicate pages to the "original" page. Also, just a note if you don't know already, IIS actually has a setting that enforces lowercase URLs and will automatically generate a redirect rule to implement them. Hope this helps. Mike
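For reference, the IIS setting Mike mentions comes from the URL Rewrite module; its "Enforce lowercase URLs" template generates a web.config rule roughly like this sketch (the rule name is arbitrary):

```xml
<!-- Sketch: 301-redirect any URL containing uppercase characters
     to its lowercase equivalent (placed in web.config) -->
<system.webServer>
  <rewrite>
    <rules>
      <rule name="EnforceLowercase" stopProcessing="true">
        <match url="[A-Z]" ignoreCase="false" />
        <action type="Redirect" url="{ToLower:{URL}}" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

Because `redirectType="Permanent"` issues a 301, link juice from the mixed-case duplicates is consolidated onto the lowercase URLs, just as described above.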
| Mike.Goracke0 -
Pages Linking to Sites that Return 404 Error
Thanks. I will check them out. I really appreciated your help on this! Best, Ricarda
| jsillay0 -
Site Blacklisted
That is good to hear, Jo. Thanks for letting us know - feedback is good. Be vigilant, because the hackers never stop. My dedicated server constantly has hackers trying to break in, mostly Chinese and Russian. Complex passwords and countermeasures keep us safe, but it only takes one weak link somewhere to break it all down.
| loopyal0 -
Sitemap all of a sudden only indexing 2 out of 5000+ pages
Also, I would be glad to send a link to our sitemap via pm if that helps.
| rock220 -
Starting a Blog and URL Structure Advice
Yeah, it is awesome for really dynamic URLs and pretty cool how you can completely make up directory structure to match your navigation or help with usability.
| Mike.Goracke0 -
Content Duplication and Canonical Tag settings
Hi Kristi, I would definitely recommend watching the Webmaster video by Matt Cutts where he explains a bit about how Google determines who the original source is. The video can be found here. In summary, he explains that there are several ways for Google to determine who was the first writer of the article. Hope this helps shine a little light on this!
| Martijn_Scheijbeler0 -
Are links in menus to external sites bad for SEO?
It's not bad at all in my opinion, especially if it's a subdomain. All it's going to do is pass on SEO value to your root domain. If you'd like to keep the link juice on your blog intact, you could make the menu links nofollow. Greg
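To illustrate Greg's suggestion, a nofollowed menu link just needs the `rel` attribute on the anchor (the URL here is a hypothetical example):

```html
<!-- Normal menu link: passes link equity to the target -->
<a href="https://blog.example.com/">Our Blog</a>

<!-- Nofollowed menu link: hints to search engines not to pass link equity -->
<a href="https://blog.example.com/" rel="nofollow">Our Blog</a>
```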
| AndreVanKets0 -
Pages removed from Google index?
Hi, Thanks for looking at this - I am submitting this - http://www.tomlondonmagic.com/sitemap.xml as my sitemap. It shows 2000+ submitted and only 153 indexed... I don't understand why they have just been removed with no message or reason. These pages are all similar, though, just targeted at different locations in the UK. Could this be why?
| TomLondon0 -
Do I need a link to my sitemap?
To follow up on Andy's point about size - you can submit multiple XML sitemaps to Webmaster Tools. When I've used multiple ones before for a large ecommerce site, I believe it helped with the indexing.
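The usual way to organise multiple sitemaps is a sitemap index file, per the sitemaps.org protocol; a minimal sketch with hypothetical URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Sitemap index: submit this one file, and crawlers discover the children -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.example.com/sitemap-products.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-categories.xml</loc>
  </sitemap>
</sitemapindex>
```

Each child sitemap is capped at 50,000 URLs, so splitting a large ecommerce catalogue this way also keeps you inside the protocol's limits.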
| TomRayner0 -
HTTPS vs HTTP sitemap
It shouldn't make a huge difference: as long as you have one sitemap (in this case) and the URLs are all the same - i.e., all http or all https addresses - you should have no issues. Though your last line makes me ask if you have two sitemaps, in which case... why? And does one link to your site as http and the other as https? In that case, yes, get rid of one in favour of the URL you want to use.
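One way to avoid the two-sitemap situation altogether is to declare a single sitemap in robots.txt, on whichever scheme the site actually serves (URL here is a hypothetical example):

```text
# robots.txt - point crawlers at one sitemap, on one scheme
Sitemap: https://www.example.com/sitemap.xml
```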
| SEOAndy0 -
Best Google Practice for Hacked Site: Shift Servers/IP or Disavow?
A few things... Make sure you have a sitemap that is always up to date and submitted to the search engines - this will encourage them to view your content first and recognise it as belonging to your domain. In addition to this, put links in your content to other parts of your site; if it gets scraped, it will probably be scraped with the links in it, so anyone actually wanting the real content can get through. If there are thousands of links from the same domain coming to your site, disavow the base URL and also report that URL for spam (it's your copyright). In fact, if you notice a small site scraping you, do that after you've tried to contact them. If this still doesn't stop them, look at your logs, see where their crawlers are coming from, and block their IPs. On one of my old sites I blocked the whole of China at one point because it was constantly being barraged by scrapers and people trying to guess account passwords. Hope that helps.
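For the IP-blocking step, a minimal sketch on Apache 2.4 (the addresses below are reserved documentation ranges, standing in for whatever shows up in your logs):

```apache
# .htaccess sketch: deny a scraper's IP and an offending range,
# allow everyone else (Apache 2.4 mod_authz_core syntax)
<RequireAll>
    Require all granted
    Require not ip 203.0.113.45
    Require not ip 198.51.100.0/24
</RequireAll>
```

Blocking at the firewall is cheaper than in .htaccess for large ranges, but this form is easy to test per-directory.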
| SEOAndy0 -
Linking without losing link equity.
As @keri has said, it would be a little odd to do it to all external, or indeed internal, links (the latter of which would be better suited to noindex tags rather than being nofollowed). Your question may be better phrased as "how can I keep all the link juice and never let it go?"... Many people have that question, but it's not a good idea to try to hoard link juice/equity. If you did as you suggest and had all external links nofollowed, search engines would see it as a kind of manipulation, and if it's done on a large scale they may ignore that command, and potentially others, across your website. Search engines like Google don't look kindly on manipulation, and beyond ignoring your commands you may end up with a penalty. Search engines look for rounded websites that both link out and are linked to, so that they are part of the community and not a silo.
| SEOAndy0 -
What might make Bing.bot find a URL that looks like this on our site?
Hi Streamline, I thought I would circle back and update everyone on what I found. You were correct that malformed URLs were the culprit of this problem. We have many isolated instances of URLs for internal links that are missing the "/" at the beginning of a relative URL. There are inconsistencies in the relative URLs all over the site. It's certainly an example of one of the many problems that can be caused by using relative rather than absolute URLs. Since we are in the process of completely redoing the site and moving to a new platform, it's something we can definitely work to get right during the transition. Thanks again to you, Daniel, and Keri for jumping in with answers. Dana
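To illustrate why that missing "/" matters, compare the three link forms (hypothetical URLs), as resolved from a deep page like `https://www.example.com/products/ships/`:

```html
<!-- Relative: resolves against the current directory, so from /products/ships/
     this becomes /products/ships/widget.html - often not what was intended -->
<a href="widget.html">Widget</a>

<!-- Root-relative: the leading "/" anchors it to the site root,
     so it resolves to /widget.html from any page -->
<a href="/widget.html">Widget</a>

<!-- Absolute: unambiguous wherever the page (or a scraped copy) is served -->
<a href="https://www.example.com/widget.html">Widget</a>
```

Dropping the leading "/" from an intended root-relative link silently turns it into a directory-relative one, which is exactly how these malformed crawl URLs get generated.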
| danatanseo0