Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • I would check with GoDaddy. If you do a WHOIS lookup, they're managing all the DNS for that website, so they're likely the ones managing the redirect if it's not with their web host or .htaccess. Jim

    | Brafton1
    0

  • Thanks guys, I have fixed the problem and got the root domain back in the search results. The issue was over-optimization and an unnatural anchor text link profile. We really appreciate the community for all the responses.

    | EVERWORLD.ENTERTAIMENT
    0

  • Got it - thanks! I am optimistic that the duplicate pages I fixed will make a big difference and that I will be able to tell, but we'll see! ... Neil

    | trophycentraltrophiesandawards
    0

  • Andrew, thank you for your response. Our next hurdle is that they want to leave the .net up while the .com is also up: two different designs, with similar if not identical content on every page. I'm pushing for internal, IP-restricted testing, then a launch in the middle of the night so we can roll back if there are issues. With thorough testing there shouldn't be any, but I know anything is possible.

    | Sika22
    0

  • Hi Jenny, jStrong is correct. Subscript (<sub>) and superscript (<sup>) tags are not going to have any effect (positive or negative) on SEO.

    | TranslateMediaLtd
    0
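For reference, the markup in question is a minimal HTML fragment like the one below; the tags only change how the characters render, not the text that search engines read:

```html
<p>Water is H<sub>2</sub>O, and Einstein's famous equation is E = mc<sup>2</sup>.</p>
```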

  • Thanks Phil. I had tried searching for indexing figures, and those numbers are relatively recent, which is positive. I'll give it another few weeks before becoming too concerned. Regards

    | Morrreau
    0

  • It always amazes me when clients ask for stuff when they don't even know why they would need it. I'm not sure it's even possible to link them, and even if it were, I don't see what value it could possibly provide without the systems those tags are intended to integrate. The value of linking Webmaster Tools and Analytics is obvious: there's stuff happening off-site that you might want to know about (inbound links, impressions on search, etc.) and have integrated into your analytics report - fair enough. The same is not true of Google Tag Manager, which simply allows you to manage your various tags easily. These tags are normally there to let other systems do things (such as collect data). It is those systems that need to be integrated, if possible (e.g. Analytics and AdWords).

    | TranslateMediaLtd
    0

  • It is unlikely that having links from MyBlogGuest would cause a drop in indexed pages like that. Where are you seeing this drop in indexed pages? Is it being reported in Moz or Google Webmaster Tools? Also, do you have Google Analytics set up for your site to check other metrics? A large drop in indexed pages does not necessarily mean something is wrong (canonical tags, cleaning up duplicate content, reporting errors, noindex tags, etc. can all cause a drop in indexed pages).

    | MikeRoberts
    0

  • I have seen this happen with sites that have been redirected. For example, I redirected site A to site B once - including all of the content from site A - and site B inherited the titles and descriptions from site A for several months. It was very irritating because I didn't use those words anywhere on the page, and I was using NOODP and NOYDIR tags. I even told Google the site had moved in GWT. Eventually it sorted itself out, but it made me see how difficult it must be for Google to keep everything straight on so many sites with so many different site elements, especially when redirects are involved. Both sites had content and were in the same niche. Site A had stronger signals and was a year older than site B; however, I wanted to redirect to site B for branding reasons. I suggest you do some digging around to find out if any other site has been 301 redirected to your site. If so, did that content ever appear on it? Good luck and let us know what you find out!

    | Everett
    0

  • Hi Irving, As Yusuf suggested, .htaccess files are not publicly viewable - they're inaccessible by public users by default. For your own site's .htaccess file, simply downloading the file via FTP and opening with Notepad/similar will work. But I take it you were looking for a tool to check the .htaccess file for public sites. If there is such a tool, it's news to me. Best, Mike

    | MikeTek
    0
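For context, the reason .htaccess files aren't publicly viewable by default is that Apache's stock configuration ships with a rule denying web access to any file whose name begins with ".ht" - roughly this fragment from the default httpd.conf (Apache 2.4 syntax):

```apacheconf
# Deny web access to .htaccess and .htpasswd files
<Files ".ht*">
    Require all denied
</Files>
```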

  • My suggestion would be to go beyond creating 'yearly' top lists for the site (these are old and tired). Look to create an 'evergreen' content page that you can use and leverage year over year, build on, and create a community and discussion around. Discuss the changes each year by revamping the list, ask people for their input (UGC), and discuss why some of the ones that fell did, while also pointing out which new ones didn't fall and why. By creating a page like this, you leverage the long-term value of a page that never gets old or outdated (as one does with a year-specific URL like the 2012 or 2014 in your examples). This will also help you build a very strong backlink profile, as your links will accumulate on one evergreen, lasting URL that stays current through the yearly updates you make. You might want to use the META information for date posted and date expired to ensure that crawlers know to come back and recrawl while the page is live, and make sure it's mapped and set up properly in the XML sitemap too. I think moving in this direction will help your link profile, let you leverage a great piece of content year over year, make it more 'shareable' from a social media perspective, and deliver long-term value. Just my 2 cents to help you out. Cheers, Rob

    | RobMay
    0
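A sketch of the sitemap entry Rob mentions, assuming a hypothetical evergreen URL; the <lastmod> date is the signal that tells crawlers to come back after each yearly refresh:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/top-widgets/</loc>
    <!-- Updated after each yearly revamp so crawlers know to recrawl -->
    <lastmod>2015-01-10</lastmod>
    <changefreq>yearly</changefreq>
  </url>
</urlset>
```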

  • I see it like this: it's a bad experience to find that an item is out of stock, and the 404 will remove you from the index. A site with a lot of products is probably not ranking well for every product page anyhow. I would focus on getting category pages to rank and not worry about a few lowly product pages.

    | AlanMosley
    0
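One common alternative to letting discontinued products 404, in line with the advice above to lean on category pages, is to 301 the dead product URL to its category. A minimal .htaccess sketch with hypothetical paths:

```apacheconf
# Hypothetical paths: send a discontinued product's visitors (and link equity)
# to its category page instead of serving a 404
Redirect 301 /products/blue-widget /category/widgets/
```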

  • Hi Sally, This could have to do with Google's desire to present the most diverse selection of results for terms, and since there are a range of other websites relevant for the [david lerner] query, Google has dropped the subdomain in favour of keeping the main site in first position, and showing a potentially relevant result for a totally different business in second position (the leggings result). This can happen even when a result has ranked well for years. To solve this, it would be good if the site displayed "sitelinks" for the brand query (i.e. the six or more links that appear below a branded search, usually linking to other key pages within the website). Obtaining these links usually requires strength and authority that far outweighs other companies with similar names. It's not very common to see subdomains appearing as sitelinks, but it does happen sometimes, when a brand and the subdomain are particularly strong, e.g. https://www.google.co.uk/search?q=ycombinator&oq=ycombinator&aqs=chrome.0.69i59j0l5.1467j0j9&sourceid=chrome&espv=210&es_sm=119&ie=UTF-8 (look at the Hacker News and Y Combinator Posthaven links). You have a partial duplication issue with both http://news.davidlerner.com/ and http://news.davidlerner.com/news_index.php displaying the snippets for largely the same stories, but this should not be enough to damage the subdomain from a Panda point of view. You may consider restricting bots' access to the /news_index.php page unless it is performing a different task from an SEO perspective for you. Again, I do not believe this has caused the site to drop out of the rankings (and it is still perfectly indexed, ranking for its URL and full name) but it's something to consider tidying up.

    | JaneCopland
    0
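If you did want to restrict bots' access to the duplicate news index page as suggested, the simplest route is a robots.txt rule at the subdomain root (the path is taken from the post above):

```
User-agent: *
Disallow: /news_index.php
```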

  • Hi Kurt, Multiple canonicals won't work. I believe our own Dr. Pete did some testing on this a couple of years ago, and the results didn't pan out. If there isn't original content on the page, you have a couple of options. First, you could de-index the page by adding a meta NOINDEX tag. A second option would be to use a cross-domain canonical to the most prominent choice (page A or page B), typically whichever is closer to the top of the page. If you have the opportunity to add unique content to the page, that might solve your problem: a couple hundred words could make the content unique, the page might have a chance of ranking on its own, and you wouldn't need to worry about canonicalization. Hope this helps! Best of luck with your SEO.

    | Cyrus-Shepard
    0
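The two options above, sketched as HTML head fragments (the canonical URL here is a hypothetical stand-in for your preferred page):

```html
<!-- Option 1: keep the duplicate page out of the index entirely -->
<meta name="robots" content="noindex, follow">

<!-- Option 2: cross-domain canonical pointing at the most prominent source -->
<link rel="canonical" href="https://www.example.com/page-a/">
```

Use one or the other on a given page, not both.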

  • Here's a replacement post with some of the same information at http://moz.com/help/pro/why-can-t-rogerbot-crawl-my-site. You can also send a note to the help desk at help@moz.com and we'll help you figure out what's up.

    | KeriMorgret
    0

  • Google penalizes you when it sees that you are gaming SEO or trying to fool Google. If you use "nofollow" on your links, you are telling Google not to count the link juice, so you are not claiming any SEO benefit from those links. No harm done. Google doesn't penalize you for any number of "nofollow" links, because you are not fooling it; you have added them for your users, not for Google. Many sites like PayPal have their links at the bottom of every site that uses their payment system, but because those links are "nofollow", Google doesn't penalize them even though they have millions of such linkbacks. Relax and add "nofollow" on these types of links - you will benefit, for sure. Hope this helps

    | vivekrathore
    0
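For clarity, a nofollow link is just a regular anchor with a rel attribute (the URL and anchor text here are made up):

```html
<a href="https://www.example.com/gateway" rel="nofollow">Payments powered by ExamplePay</a>
```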

  • AJAX is great for loading content on demand without reloading the page, but I would not use it for content you want to rank. Dynamic content that you want to rank should be loaded on the initial request to the server. Solutions like the one you describe just get very messy and are more work than doing as I suggest.

    | AlanMosley
    0
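The contrast Alan describes, as a minimal sketch (the /api/description endpoint and the copy are hypothetical):

```html
<!-- Indexable: the content is present in the initial HTML response -->
<div id="product-description">
  <p>Hand-finished oak desk, 120cm x 60cm.</p>
</div>

<!-- Risky for ranking: an empty container filled in by JavaScript after load -->
<div id="product-description-ajax"></div>
<script>
  fetch('/api/description')
    .then(function (r) { return r.text(); })
    .then(function (html) {
      document.getElementById('product-description-ajax').innerHTML = html;
    });
</script>
```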

  • Hi, Definitely don't use disavow unless you think that the links are poor quality and could harm your site, or are actively harming it right now. That is what disavow is for, not for removing your 404 pages. There is no harm in waiting for Google to remove the 404 pages on its own, especially if you have used its URL removal tool as well. If there are any good links in the backlink profile of the 404ing pages, do attempt to contact the webmaster and have them changed - most people are more than happy to do this.

    | JaneCopland
    0
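For anyone who does end up needing disavow for genuinely harmful links (again, not for 404 cleanup), the file is plain text with one domain or URL per line; the domains below are made-up examples:

```
# Lines starting with # are comments
domain:spammy-example.com
http://another-example.com/bad-page.html
```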