Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • Depending on the plan you have, the spider crawls 10k–20k pages per campaign.

    | OlegKorneitchouk
    0

  • Hi Carrol, As Robert has put so well... if it were only that easy. :) Seeing rankings go up and then down for new pages (or newly optimized ones) is a pretty common phenomenon, with their freshness giving them a little boost that then dies away. Remember, your SEO will never be a set-and-forget effort. Now that you've done your first work and are seeing confirmation that you've done it well via your A+ rating in our tools, take a little moment to celebrate how much you've accomplished and then get ready for your next steps. These may include, but won't be limited to:

    - Studying your analytics over the coming months to see how users are interacting with your site. This may lead to improvements you can make in both SEO and human usability. How people use your site affects your rankings.
    - Creating your ongoing content strategy. You may be blogging, or creating whitepapers, infographics, or other cool stuff to begin to draw steady, valuable traffic to your site. Having a lively site affects your rankings.
    - Analyzing your competitors. While the age of your site vs. that of competitors isn't something you can control, you can learn a lot from what your competitors are doing and then figure out how to do it better. Remember, no search engine ranking exists in a vacuum. All the while you are struggling to get to the top, your competitors are doing the same, but if you can make a stronger effort than others who are being neglectful, you can hope to surpass them.
    - Creating your social media strategy. Find out where your customers are and then figure out how you can participate in and add value to the places they hang out most.
    - Ongoing link building. In competitive markets, this may be a never-ending task.

    While the effects of some SEO can be seen pretty quickly (within hours or a few days), most efforts are going to take some time to really settle in. A week is definitely not long enough to judge how Google will ultimately respond to the efforts you have made so far, but that's okay, because your efforts are just beginning. Being #18 isn't a bad place to start with a newly optimized website if you are in a competitive market, Carrol, but it does indicate that there will be work ahead of you. By making a steady, consistent effort to continue to grow and improve in as many areas as you can, you will be continuously feeding the bots data that says, "I'm here, I'm active, and I'm doing everything I can to be a great resource for visitors." Hang in there. Don't be nervous. Expect this to take time and you'll be approaching the game with the right attitude. Sincerely wishing you good luck!

    | MiriamEllis
    0

  • Hey Robert, thank you for your valued input. They contact me because my news comes out before the brand announces it in most cases, and they know that... so I reckon this one is a good one too, I hope. Thanks again, bro.

    | smoki666
    0

  • Start by using the Moz campaign tool. Follow the directions, enter the keywords and keyword phrases that you are trying to rank for, then let it crawl your site. Wait for the report, then carefully complete the tasks one by one. It's a very time-consuming, daunting process that takes hard work. In the end, all you are doing is building a better site for the search engines to crawl and rank. Just go step by step. Take the long road, no shortcuts. If someone tells you they can fix it in a day, they are full of crap. Building an online business is harder than running a brick & mortar any day of the week. Start with the basics: read a lot and listen to people who sound reasonable; don't listen to snake oil salesmen. Email me anytime at charris@thegardengates.com. We are a real business doing this all in-house after going down the consultant road (trying to take shortcuts); we spend a lot of time on this, 7 days a week. Chad

    | CHADHARRIS
    0

  • Swiftseo, I personally would not have two near-identical sites on different domains. No matter what the reason, I think there is more negative than positive, and I can name multiple reasons why using this setup for testing is flawed. I would pass on the canonical and instead use a 301 redirect from the testing site to the main site. That way, if there is anything in the way of link juice to pass, it gets passed. If you need help with how to set up a 301, there are various places to go, or post a question in Q&A. The main thing you are hurting with this methodology is yourself: you have two sites competing against each other. Hope this helps,
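    For illustration only, a minimal .htaccess sketch of a site-wide 301 (assuming an Apache server with mod_rewrite enabled; both hostnames are placeholders, not from the thread):

    ```apache
    # Redirect every URL on the testing domain to the same path on the main site.
    # "testing-site.example" and "main-site.example" are placeholder hostnames.
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^(www\.)?testing-site\.example$ [NC]
    RewriteRule ^(.*)$ https://main-site.example/$1 [R=301,L]
    ```

    The [R=301,L] flags make it a permanent redirect and stop further rule processing, so each request is answered in a single hop.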

    | RobertFisher
    0

  • Wayne, I hate to say it, but it is because, like life, Google is not fair... Not that it is their fault; you can only do so much, even if you are Google. When I looked at the site in OSE I was surprised to see what they are getting away with. I cannot suggest to you what to do, but if they were my client I would work on making sure this changes in some way. (Hope I haven't said anything that offends anyone engaged in this type of tactic.) Best, Edit: One question I would have is: are they doing the same in PPC? If so, then you really have an actionable issue, given the sites are all the same.

    | RobertFisher
    0

  • Hi Nick, Thanks a lot for the advice. Regards, Nikos

    | nyanainc
    0

  • I love Ray's work. I would honestly not worry too much about the "too many page links" warning. The only issue you will run into is that search engines may not crawl all your content. You can fix this with sitemaps that hint at what should be crawled. You can also mark certain links "nofollow" or pages "noindex" depending on what you would like indexed/crawled. The duplicate titles could be fixed by making each title unique, either by appending a page number or something unique about the page (a username, a date, a title with a category, etc.). Canonicals make sense where different URLs result in the same page being displayed.
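    As a hypothetical sketch (the URL, site name, and page are placeholders, not from the thread), the head of page 2 of a paginated listing might combine these hints like so:

    ```html
    <head>
      <!-- Unique title: appending the page number avoids duplicate-title warnings -->
      <title>Widgets – Category Foo – Page 2</title>
      <!-- Keep this paginated page out of the index, but let crawlers follow its links -->
      <meta name="robots" content="noindex, follow">
      <!-- Point duplicate URL variants at the one version you want indexed -->
      <link rel="canonical" href="https://www.example.com/widgets?page=2">
    </head>
    ```

    The robots meta tag applies to the whole page, while rel="nofollow" on an individual anchor applies only to that link.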

    | MagicDude4Eva
    0

  • Endorsing Jared for the full thread/follow-up. Unfortunately, when it comes to indexing all of these pages, you can't really have your cake and eat it too in 2012. These pages do look thin to Google - honestly, when the results don't change (and I get that that's just because the filters don't always impact the search), it starts to look like you're just spinning out duplicates to target new keywords in the header. At high volume, that could get you into trouble (and is the kind of thing Panda has targeted). You're right, though: if you canonical these pages, they won't get indexed and ranked. These days, my gut reaction is that the trade-off is worth it. If you focus your ranking power, the core category/neighborhood/etc. pages will get more authority, you'll reduce the risks of thin content, and you'll land search users on core pages they can use to navigate to the options they want. There's no solution that doesn't involve a trade-off, but I think focusing your index would be a positive one. Keep in mind, too, that Google isn't really that fond of search pages - ultimately, you want them indexing the core property listings. The key is to have clear paths to those listings and to index and rank prominent category pages. If you try to rank for every variation of every search/sort/etc., you'll just end up diluting your ranking ability in most cases.

    | Dr-Pete
    0

  • Canonical tags should drastically help with this. The % is being generated because the URL is being encoded and has a "(" in it. Have your product pages each contain their own canonical with the URL you want indexed. Not sure which URL to use? Check your internal links and see how your site is linking to your product pages. Presumably it's: http://www.company.com/ProductX-(-etc/ or http://www.company.com/ProductX-(-etc Add this URL as your canonical and the search engines will understand which page is the 'real' page. This will solve both problems from an SEO standpoint. If you want to actually stop the site from doing this, you can remove trailing slashes and encoding using .htaccess.
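    For the trailing-slash part, a minimal .htaccess sketch (assuming Apache with mod_rewrite; this is one common pattern, not the thread's actual configuration) that 301-redirects trailing-slash URLs to their slashless versions:

    ```apache
    RewriteEngine On
    # If the request is not a real directory and the path ends in a slash,
    # permanently redirect to the same URL without the trailing slash.
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteRule ^(.*)/$ /$1 [R=301,L]
    ```

    The !-d condition exempts real directories, whose URLs conventionally keep the trailing slash.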

    | JaredMumford
    0

  • Awesome, thanks. I think we have a few issues like that, so I'll have to have them do a sweep and clean it up.

    | KateGMaker
    0

  • It could be that there hasn't been a PR update since the pages received their last grade. And like CMC-SD said, low PR doesn't necessarily mean anything negative. If you have a well-trafficked page and it's converting, forget about chasing PR. I've had sites that have been PR3 and PR4 for years and drive a ton of high-value, converting traffic from the SERPs, and rank for all kinds of fat-head, competitive phrases.

    | essdee
    0

  • Happy to hear all is well. Good job on the microdata; more sites should use it. Robert

    | RobertFisher
    0

  • Hi Carrol, It can take a little while for Google to update this and show your Google+ image next to your listings. When I did it, it took about two weeks before my profile image started showing next to my webpages. I wouldn't worry; just hold tight, and over the next couple of weeks it will happen. Matt

    | MatthewBarby
    0

  • My mistake - I thought that was the code that enbphotos wanted to add to the .htaccess. However, if that is the existing code, it is pretty obvious that the rewrite engine is engaged, Alan, haha. Cheers. My apologies for misreading the response. Other than that, I agree with what Alan says...

    | Matt-Williamson
    0

  • Cool, thanks for your backup on this. I figured that the bot redirect described would be a bit over the top. And it's great to know that having an HTTP homepage on an otherwise HTTPS domain is a non-issue. Much appreciated!

    | zeepartner
    0

  • Hi Joshua - If someone is linking to the www version, then it doesn't pass as much juice as it would if it weren't redirected (there's lots of info on this on the internet, with varied opinions). Overall, most SEOs agree that an inbound link pointing directly to a page without being 301 redirected has more of a positive SEO effect. With that being said, in your case Google Webmaster Tools may be detecting this double-redirect error simply because there is an external website somewhere linking to the 'www' version. You can find this using OSE, or in WMT by going to Crawl Errors and looking for the sunny-isles URL. Clicking on it (if it's there) will show who is linking to you and from where. BTW - when did you do the redirects, and how long has it been since you noticed the new URL wasn't indexed (and was the old URL indexed)?
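    For illustration, a minimal .htaccess sketch (assuming Apache with mod_rewrite, and assuming the non-www host is the canonical one; "example.com" is a placeholder, not the site in the thread) that sends www requests to the non-www host in a single 301 hop, which avoids the double-redirect chains WMT flags:

    ```apache
    RewriteEngine On
    # Send any www request straight to the non-www host, preserving the path,
    # so no link juice is lost to a chain of redirects.
    RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
    RewriteRule ^(.*)$ https://%1/$1 [R=301,L]
    ```

    Putting this rule first keeps every hostname variant one redirect away from the canonical URL instead of chaining through intermediate hops.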

    | JaredMumford
    0