Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Sounds like your index page is more helpful for users who want to see all your videos than for Google and other search engines. I'd either find a way to make your index page less duplicative or simply de-index it. I'd imagine that most searches that land a user on your site are for specific videos, not for videos made by your brand in general; the latter group of folks will likely land on your homepage anyway.

    | EricaMcGillivray
    0

  • It shouldn't cause an issue either way. Moving from HTTP to HTTPS is one of those edge cases. Basically, it's a login page, so link equity really isn't a factor (most of those links are likely to be ignored or devalued anyway), and it's not really a reciprocal link. My advice: don't sweat it.

    | Cyrus-Shepard
    0

  • I would compare before-and-after date ranges in analytics to see where your traffic loss is occurring. Have you been manually tracking the rank of your keywords over time? Have you noticed a drop-off for a keyword that you would expect to be a high-traffic driver? It could be that you have lost a referring link that was sending over a lot of traffic. Were you running any PPC campaigns that may have had an effect on your volumes? People sometimes forget that PPC spend also drives organic traffic, so you would see a natural decline in organic traffic if you slow down your PPC spend.

    | IPIM
    2

  • Thank you for your answer. How would you envision a sitemap for show listings functioning?

    | TheaterMania
    0

  • Well, I know Bing says they support it, but how well it works, I don't know. You can read about it in this Bing blog post, and Tom from Distilled answered a similar question on Moz here. Hopefully these will help a little. -Andy

    | Andy.Drinkwater
    0

  • If you'd like to instruct Google not to crawl the URLs with that parameter, select 'No URLs' in the Google Webmaster Tools > URL Parameters section. Still, this is only a guide for Google to follow, and they do not always obey the rules you set. The robots.txt method ensures that Google will not crawl those URLs.

    | Ray-pp
    0
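
The robots.txt approach Ray-pp mentions can be sanity-checked before deployment. A minimal sketch using Python's standard-library parser; the `/search` path and domain are placeholders, not from the thread, and note that Google (unlike this parser) also supports wildcard rules such as `Disallow: /*?sessionid=` for parameter URLs:

```python
# Sketch: verify a robots.txt crawl-blocking rule with the stdlib parser.
# The stdlib parser matches path prefixes only; it does not understand
# Google's "*" wildcard extension, so a plain prefix rule is used here.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /search
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Parameterized URLs under the blocked prefix are disallowed;
# other URLs remain crawlable.
print(rp.can_fetch("*", "https://example.com/search?q=shoes"))  # False
print(rp.can_fetch("*", "https://example.com/videos"))          # True
```

Checking rules this way catches typos before a bad directive blocks pages you wanted crawled.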

  • Sorry for the delay, rpaiva. Yes, it would be two URLs with unique content.

    | Townpages
    0

  • Hi there, Generally those types of 404s won't be too harmful; they sound like they may have been somewhat artificial WordPress pages. What I would do is get your list now from Analytics or Webmaster Tools; that way you will capture the URLs that actually got traffic or impressions in Google and redirect those. So run a landing pages report in Analytics and a top pages report in Webmaster Tools, maybe for the last 6 months. Create a text file of all the URLs, run them in list mode through Screaming Frog, and redirect any that 404.
    If I could go back in time, what I would have done with Screaming Frog is let it crawl everything; you have to allow it to "follow redirects," "ignore robots.txt," and so on. I know Google is not supposed to crawl anything blocked in robots.txt, but you'd basically be letting Screaming Frog get to everything so that you don't miss any URLs.

    | evolvingSEO
    0
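
The redirect step in the workflow above can be scripted once you have the list of 404ing URLs. A minimal sketch, assuming Apache-style `Redirect 301` directives; the old-to-new URL mapping is hypothetical, not from the thread:

```python
# Sketch: turn a mapping of 404ing URLs (gathered from Analytics /
# Webmaster Tools and confirmed with Screaming Frog) into Apache
# 301 redirect lines. The paths below are placeholders.
redirect_map = {
    "/old-post/": "/blog/new-post/",
    "/2012/01/draft-page/": "/blog/draft-page/",
}

def build_redirects(mapping):
    """Emit one 'Redirect 301 <old> <new>' line per mapping entry."""
    return "\n".join(
        f"Redirect 301 {old} {new}" for old, new in sorted(mapping.items())
    )

print(build_redirects(redirect_map))
```

The output can be pasted into an `.htaccess` file; on nginx the equivalent would be `rewrite` or `return 301` directives instead.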

  • Hi there, It certainly looks like there's an issue with links. Here are a few resources on cleaning them up:
    - http://searchengineland.com/five-steps-to-clean-up-your-links-like-a-techie-166888
    - http://moz.com/blog/google-webmaster-tools-just-got-a-lot-more-important-for-link-discovery-and-cleanup
    - http://www.greenlaneseo.com/blog/2014/01/step-by-step-disavow-process/
    - http://savvypanda.com/blog/guide-how-to-use-google-disavow-tool.html
    Essentially you need to remove and/or disavow as many bad links as you can. I'm also going to guess that you could use a design upgrade. I don't have any data to back this up; it's just my opinion from looking at the site. A poor design could also bring Panda into play, and I've seen some sites get a bump after a design upgrade.

    | evolvingSEO
    0

  • Hi! Even though it's true that Google weighs content - especially internal and external links - differently depending on which section of a web page it sits in (header, sidebars, footer, main body), it is a myth that "additional" content explaining, for instance, the league table should be placed first for better impact. Moreover: isn't the table the most important element of your content, the one your users are most interested in? Burying it under a "text" explaining it would offer your users a poor experience. Topic modeling should not be twisted into something done just because "Google" will supposedly rank the page better. The consequences are inevitably bad, such as "SEO texts" with almost no value for the user, and even dangerous, because Google could start applying spam filters such as the one for keyword stuffing. Write naturally, as if you were describing something to someone in conversation. If, during that description, it makes sense to use synonyms and/or introduce related concepts (e.g., electric guitars on a page about acoustic ones), then go for it. If it doesn't make sense, then don't. Finally, if by topic modeling you mean "LSI," then know that LSI is another myth.

    | gfiorelli1
    1

  • Hi there, First of all, let me clarify one point: Google supports frames and iframes, but only to an extent. Google does not support frames and iframes completely; however, it is often capable of crawling iframes on web pages if they are SEO-friendly, and it can pass link juice via iframes (read more about it here: https://www.seroundtable.com/google-iframe-link-14558.html). In my personal opinion, avoid using iframes if you can, because they are not 100 percent search-engine friendly. If they are required, then you should take the effort to make them as Google-friendly as you can. Here are a few tips for that:
    - You can use Google Webmaster Tools to submit the URL of the iframe source to be crawled. However, this will only index the iframe content and not the parent HTML document, so it won't score highly on the search engine results page.
    - If you have frames on the master page of your website, then to ensure Google crawls the entire site, include links to the other pages of the site inside the noframes area of the master page.
    - Make sure your main or surrounding page carries the most weight with search engines. It's best not to make your main page simply a placeholder for one iframe or even multiple iframes.
    - Try to move as much information as possible from the iframes to the main page. At a minimum, your main page should describe the content within the iframes.
    - If you want the iframe to be indexed by Google, include a link from the parent page to the iframe page.
    - Add meta tags to the master page. However, meta tags are only a partial solution, because not all search engines support them. Most search engine spiders will only see the master page; just like an old browser, they don't understand the instructions for producing the frame layout, so those are ignored and only information within the noframes tags is read. So use noframes tags as well.
    Apart from this, I would recommend reading the following article as well: http://searchenginewatch.com/article/2064573/Search-Engines-and-Frames Hope this will help!

    | sachin-sv
    0
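
The "link from the parent page to the iframe page" tip above can be illustrated with a minimal markup sketch; the filenames and text are placeholders, not from the thread:

```html
<!-- Parent page: give crawlers a plain link to the framed content,
     plus fallback text for agents that don't render iframes. -->
<iframe src="video-player.html" title="Product video">
  <p>Your browser does not support iframes.
     <a href="video-player.html">View the video page</a>.</p>
</iframe>
<p>See also: <a href="video-player.html">video transcript and details</a>.</p>
```

The plain `<a href>` gives crawlers a normal path to the iframe source even if they never render the frame itself.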

  • I'm pretty sure it was on the product page you messaged about. Not that there is anything wrong with marking up several things on a page, but I'm just saying perhaps one of those other areas (e.g. pharmacy) is where the incomplete markup is instead of the product markup.

    | Everett
    0

  • I wouldn't worry about the location of the IP too much. What's most important is that you're getting quality, value-added links to your website within your niche. This is conjecture on my part, but I'd say it is more important to receive a link from a .es domain than from a website with an IP in Spain (for ranking specifically in the .es SERPs). However, unless the terms you're trying to rank for are very competitive, I would concentrate on just earning quality links, regardless of location. I'm not sure there would be much incremental benefit from the IP address location.

    | Ray-pp
    0

  • Hi vijayvasu, Maintaining a healthy link profile is necessary, and being proactive, as your question implies, is a great habit. Start by using Moz's link analysis tool. Export the links to Excel and remove duplicate domains, then identify from that list the domains that may be spammy (low DA/PA). You'll need to visit each site to judge how spammy it is and whether or not you should proactively shun that link. Then use Google's disavow tool to ask Google to ignore those links.

    | Ray-pp
    0
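
The "remove duplicate domains" step Ray-pp describes can be sketched in a few lines; the backlink URLs below are placeholders, not from the thread:

```python
# Sketch: reduce an exported backlink list to unique linking domains,
# as a first pass before reviewing each one for spam signals.
from urllib.parse import urlparse

backlinks = [
    "http://spammy-directory.example/page1",
    "http://spammy-directory.example/page2",
    "https://goodblog.example/review",
]

def unique_domains(urls):
    """Return the sorted set of hostnames found in a list of URLs."""
    return sorted({urlparse(u).netloc for u in urls})

print(unique_domains(backlinks))
# ['goodblog.example', 'spammy-directory.example']
```

Deduplicating by hostname first keeps the manual review list short, since a spammy domain usually links from many pages.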

  • What has worked for my clients in the past is to have two domains: (1) the main domain for SEO purposes, set as the primary domain in Google Webmaster Tools; (2) a shorter marketing domain that forwards (with masking) to the main domain and passes the juice. This way the marketing domain can be short, advertised in newspapers, on business cards, etc., and easily remembered, while the main SEO domain still gets the value of the direct traffic.

    | Laurean
    0

  • Hey John If you decide to take action, then being aggressive with the links is a good approach. Both Cyrus Shepard's great Moz blog post on the disavow tool and advice from Google itself say that if you suspect an entire domain of being spammy, go ahead and disavow all of it.
    However, from my own perspective, I would only create a disavow file if I knew for sure that I was suffering from a manual or algorithmic penalty. I have seen very little benefit in being proactive with that tool (e.g., rankings are good, you spot bad links in your link profile, and you disavow them to be safe); in fact, I have seen a number of cases where a disavow was submitted "prematurely" - i.e., a site was ranking fine, then disavowed some links and saw rankings fall. To look at it from a slightly skeptical point of view: if you're not suffering from a Google penalty, do you really want to inform Google that you have suspicious links in your profile? That is a matter of preference based on my own experience, though. I would certainly take note of the links you think are bad (and perhaps put together a file ready to go, just in case).
    Worth noting that prweb.com has made all of its links nofollow anyway, so since they're not passing on link equity, it doesn't seem logical to disavow them (they have no SEO benefit). Also keep in mind that if you visit the page and the link is not there - and especially if a Google search for cache:http://www.example.com shows that the cached version contains no link - there's a very good chance the link has already been discounted and so would not be flagged in a manual or algorithmic check. Seeing as you have so many links from those domains, that may be what's occurring. Hope this helps

    | TomRayner
    0
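
For reference, the disavow file Tom describes is plain text, one rule per line, with `domain:` for domain-level rules; the domains here are placeholders:

```
# Lines starting with # are comments and are ignored.
# Disavow every link from an entire domain:
domain:spammy-directory.example
# Disavow a single URL:
http://another-site.example/bad-page.html
```

The file is uploaded through Google's disavow links tool; domain-level rules are the aggressive option discussed above.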

  • Hi, This is completely normal at the moment; many 301'd URLs stay in the index for 6-12 months. Case in point, Google this: site:seomoz.org. There isn't anything you can do. Verify your 301s are set up correctly and move on.

    | anthonydnelson
    0

  • Judging from your original comment, it sounds like you know what you are doing; just give it some time. Sometimes I find that an FAQ or something similar will rank over a more category-based page because, despite being less targeted, the FAQ is full of content while the category page is quite thin in comparison. Here is what I would do: update the Loft Conversions Essex page to include more, and better, content; build a few external links to that page to strengthen its authority; and give it some time.

    | anthonydnelson
    0

  • Thank you.  Just signed up for a month.

    | Shawn124
    0