Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Web Design

Talk through the latest in web design and development trends.


  • Thanks, yes, I agree it depends on my audience... and the average age of the web visitors is one of the components that should be considered.  The older visitors would probably still like the "home" button; the younger ones probably know that the logo is the home page link.

    | FindLaw
    0

  • I am very big on WordPress. After reading the title I was hoping I would be able to just hand you an answer once I read the full details of your requirements... but as it turns out, in all my years of playing around with WordPress I have yet to see a plugin that will optimize your tag pages that way. My suggestion would be to hire a decent WordPress dev from CodeCanyon to help you build a custom plugin for this. Other than that, this plugin is not currently available on the market in either a paid or open-source version; you have to make your own. P.S. I am guessing you saw something similar on a bigger soccer website somewhere? Good luck getting your plugin built!

    | LayGiri
    1

  • Go through everything on your website (navigation, content, XML sitemap) and find links to those URLs. Make sure that your internal links are all updated in addition to having the 301s in place.

    Let's say all the old URLs are in the folder /stuff/. You can set up a spider like Screaming Frog to crawl your current site and, using the custom search feature, report every page that links to internal URLs containing /stuff/. This tells you which pages you need to update links on. You can also generate a list of all the /stuff/ pages you link to internally for testing later.

    Once you make the updates to your site with the links and 301s, you can then use the spider to check things two ways. (Ideally you would first do this on a development server, test, then go live and test again once you are live.)

    1) Have the spider go through your site (spider mode) and your XML sitemap and make sure there are no links to /stuff/ and that it finds no internal 301s.
    2) Have the spider go through the list of old /stuff/ URLs (list mode) and make sure they all 301 to the correct page.

    You could go a step further and use OSE (Majestic, Ahrefs, etc.) or the data from Google Search Console to find external sites that link to your old /stuff/ pages, and do two things: 1) if the link is from an authoritative site, ask them to update the link; 2) cross-check all the external links to /stuff/ pages against your internal audit to catch any you missed, and make sure those 301 to the correct page.

    This all assumes you are doing a one-to-one redirect from your old pages to new pages, i.e. you are keeping the content the same on the old and new pages and just updating the URL. If you have any old content that has no links or is of low quality, you may want to consider a content audit and let those pages 404/410.
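    A minimal sketch, in Python, of the final "list mode" verification described above: given the status code and Location header the spider observed for each old /stuff/ URL, flag anything that is not a clean one-to-one 301. All URLs and the classify() helper here are hypothetical illustrations, not part of Screaming Frog.

```python
def classify(old_url, status, location, redirect_map):
    """Classify one crawled old URL against the intended redirect map."""
    expected = redirect_map.get(old_url)
    if status != 301:
        return "NOT_301"       # e.g. a 302, 200, or 404 instead of a 301
    if location != expected:
        return "WRONG_TARGET"  # 301s, but to the wrong new page
    return "OK"

# Intended one-to-one mapping from old URLs to new URLs (hypothetical)
redirect_map = {
    "/stuff/widgets": "/products/widgets",
    "/stuff/history": "/about/history",
}

# (status, Location) pairs as observed by the spider in list mode (hypothetical)
crawled = {
    "/stuff/widgets": (301, "/products/widgets"),  # correct 301
    "/stuff/history": (302, "/about/history"),     # temporary redirect, should be 301
}

results = {url: classify(url, s, loc, redirect_map)
           for url, (s, loc) in crawled.items()}
print(results)
```

    In practice you would export the crawl results from the spider to CSV and feed them in, rather than typing them out by hand.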

    | CleverPhD
    0

  • Hello, I came across this question when trying to find a solution for a similar question.  One of my clients has an event coming up where they will have multiple photo galleries.  The images within the photo galleries will have titles and descriptions and one image can potentially show up in more than one photo gallery. In this case, would you recommend implementing canonical tags?  If not, anyone have any other suggestions? Thanks in advance!

    | seoFan21
    0

  • If the content is going to be the same, keep them merged, don't separate. If you have sufficient content to produce truly valuable pages then you can segment them. It really is that simple.

    | rjonesx.
    0

  • Bob, I love two sites. Airbnb is my favorite site: https://www.airbnb.com.au/ The second, for layout and ideas, is http://www.squarespace.com Hope that assists. P.S. I do not like Squarespace as a platform, and I'm not on Airbnb!

    | ClaytonJ
    1

  • Hi J.P., Page depth refers to the number of clicks it takes a user (or search engine) to get to a page from the home page. And yes, this can negatively impact SEO, as well as usability, if it takes too many clicks to get to a page. (Search engines may use up their crawl budget, while users may simply leave your site for one where it is easier to find what they are looking for.)

    However, the number of clicks it takes to navigate to a page doesn't necessarily mirror the URL structure. So, in the example you give, the existing URL makes sense: http://www.domain.com/procedures/breasts/augmentation. It is readable, includes keywords, does not use special characters, avoids stop words, and isn't too long. If you were going to change the URL structure, I would go with http://www.domain.com/procedures/breastaugmentation over http://www.domain.com/breastaugmentation, unless the only procedure your client does is breast augmentation.

    But, seeing as there is no compelling reason to change the URL structure, I would leave it as-is. Even if you perfectly plan and execute all 301 redirects and update every link, you are creating a lot of work for yourself (as well as for anyone who links to you, as best practice would be to ask them to update to the new link). Furthermore, 301 redirects are known to pass slightly less value over time. Of immediate concern, you should expect fluctuations in the site's performance in the SERPs when making any sitewide change such as this.

    For more information on best practices for URL structure, I recommend checking out this post by Rand Fishkin: https://moz.com/blog/15-seo-best-practices-for-structuring-urls. I hope that helps! Christy

    | Christy-Correll
    0

  • The way you structure the URL has no impact on SEO performance (unless the URL becomes very long and unreadable). The depth of a site has an impact (how many clicks do I need from the homepage to get to a page), but this is completely unrelated to the folder structure. A good article on why it's good to have a certain structure in your URL can be found here: http://www.bruceclay.com/blog/structured-urls/ As changing URLs always carries a risk and doesn't bring SEO benefit, I wouldn't change them just for the sake of changing them. Dirk

    | DirkC
    0

  • I am confident that the pages in your persistent navigation are given more weight by Google. On each of my sites I have a sidebar that presents a categorized list of links to the important pages that I want everyone to see. They could be there because I want them to be given attention by Google, because I want them to have an opportunity to be seen by every visitor, because they are new and I want to promote them, or because they are important sales pages. The strategies and categories are different for every site.

    If someone lands on your site and sees three links to other pages, what message does that send? Nuthin' here? If someone lands on your site and sees a great presentation of inviting topics, what message does that send? If you want people to see what you have, don't make them dig for it. If you have great content, get it out there and flaunt it.

    Some of my good friends are designers who proselytize the concept of minimalist design. They think that I should have no more than three to five links on any page. They "tisk tisk" when they look at a site like mine. But each of my sites has more traffic than all of theirs combined.

    | EGOL
    0

  • You certainly don't need to include every page of your website in your top navigation menu. Your plan of having a Locations page that then links to each of your location pages individually is a fine way to go. That said, the deeper into your site architecture your page is, the fewer ways there are for people and search engines alike to discover it - to your point, there is now only one page on your site linking to all of these location pages. One reason internal linking outside the navigation is important is that it provides additional ways for users and search engines to browse to your content. I would recommend taking a look at the pages on your site and thinking about what pages a user might want to visit next, and linking to those. Providing an intuitive next step for your users keeps them engaged, and provides additional ways for your content to get discovered.

    | RuthBurrReedy
    0

  • Adding to this, and sorry if there is an obvious answer, I just want to make sure: I should create a new tag for the beta site that is blocked, yes?

    | Nobody1596916721222
    0

  • Hey guys, big thanks to everyone chiming in here. We did do the Fetch as Google, we don't have any rel=canonical issues, and we didn't move to any other domain (we even kept the same IP). We've dipped down 8% in Google, which is very expected with such a large switchover, but the 25% drop in direct traffic is what has been hard to figure out. At first I thought it might be search traffic masquerading as direct (as several articles have noted), but that is only down 8%. It might be technical in nature; perhaps the analytics code isn't firing correctly somewhere. I was just curious what you all had experienced with a major re-launch... Up until this point the site was last overhauled in 2007.

    | ScottOlson
    0

  • Thanks Keri

    | edward-may
    2

  • Thanks Nitin and Moosa! I appreciate your replies.

    | Eric_haney
    0

  • Hi Dino, I don't see any issues. It is okay to use multiple H1 tags for reasons such as this; Google has confirmed multiple H1 tags are okay. My example above was probably more alarming to you than I realized. My effort was to point out a simple case of how to use CSS for multiple device types. In your case, having different text is for the benefit of the user, which is exactly as it should be. Good job, Don

    | donford
    0

  • No problem at all -Andy

    | Andy.Drinkwater
    1

  • That's what I was thinking too. It may only be a few weeks, but that will give me some idea of what to keep. I just didn't know if that mattered and whether removing those pages would have a negative effect. But since it's on a separate domain, I guess not. Thanks!

    | codyfrew
    0

  • This does a pretty good job of explaining lazy loading: http://www.thesempost.com/lazy-loading-images-likely-will-indexed-google/

    | Vacatia_SEO
    0

  • Hey Trenton, Do the pages in fact return a 404 code now? You can check with http://urivalet.com/ set to Googlebot. Are they indexed in Google? Search for the URL and put 'site:' before it. If they 404 and are indexed, it will just take time for them to drop out. Google continues to crawl pages it once discovered, even if they are not linked to anymore, and these will definitely show up in your crawl errors. Pages with crawl errors are actually a good thing if that's what you expected and intended, which in this case it was. I know it stinks to have errors showing up in the report when in fact they are not really errors you have to "fix", but think of it more like a report; for some pages it's perfectly OK to 404.
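    A rough sketch of the same status-code check in Python, fetching a page while identifying as Googlebot (similar to what urivalet.com does). The example URL is a placeholder; note that a site verifying real Googlebot via reverse DNS will still see this as an ordinary client.

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError

GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def status_as_googlebot(url):
    """Return the HTTP status code seen when requesting url as Googlebot."""
    req = Request(url, headers={"User-Agent": GOOGLEBOT_UA})
    try:
        return urlopen(req, timeout=10).status
    except HTTPError as err:
        return err.code  # 4xx/5xx responses arrive as HTTPError

# status_as_googlebot("https://example.com/removed-page")  # expect 404 or 410
```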

    | evolvingSEO
    0

  • Another couple of notes: URLs are ugly on their demo site at http://lesbonneschoses.prismic.me/blog/

    Example post: http://lesbonneschoses.prismic.me/blog/UlfoxUnM0wkXYXbi/one-day-in-the-life-of-a-les-bonnes-choses-pastry (the post-ID subfolder needs to go)

    Example category: http://lesbonneschoses.prismic.me/blog?category=Announcements (I would prefer to see these parameters as subfolders, like /blog/category/announcements/, and also keep them lower-cased)

    It looks like the /UlfoxUnM0wkXYXbi/ subfolder can be removed; see "Link Resolver" on this page: https://developers.prismic.io/documentation/developers-manual. Here are some of their notes on caching: https://developers.prismic.io/documentation/developers-manual#cache
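    To make the "Link Resolver" idea concrete, here is a hedged Python sketch of building clean blog URLs from a document's slug alone, dropping the opaque document-ID subfolder and lower-casing category names. The document fields (type, uid) are simplified stand-ins for illustration, not Prismic's actual API.

```python
def link_resolver(doc):
    """Map a CMS document to a clean, keyword-based URL."""
    if doc["type"] == "blog-post":
        return "/blog/" + doc["uid"] + "/"
    if doc["type"] == "category":
        # categories become lower-cased subfolders instead of ?category= params
        return "/blog/category/" + doc["uid"].lower() + "/"
    return "/"

post = {"type": "blog-post",
        "uid": "one-day-in-the-life-of-a-les-bonnes-choses-pastry"}
print(link_resolver(post))
# -> /blog/one-day-in-the-life-of-a-les-bonnes-choses-pastry/
```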

    | KaneJamison
    0