Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: On-Page / Site Optimization

Explore on-page optimization and its role in a larger SEO strategy.


  • Always welcome

    | DmitriiK
    0

  • I think it is important to note something that is a bit of a misnomer in the industry. Page Authority (PA) and Domain Authority (DA) are NOT measures of how much a page or domain would benefit you were you to receive a link from it. They are machine-learned, keyword-agnostic predictive metrics for rankings. That is to say, a high-DA domain or high-PA page is more likely to rank, but it is not necessarily more likely to help your site rank were you to acquire a link from it.

We know that the most important raw metric continues to be the number of linking root domains. We also know that nearly all published link-graph calculations use metrics like the number of links per page to divide the link value passed from one page to another. Consequently, the actual value you get from a link can be wildly different from what the PA or DA might suggest.

I would take a different approach altogether. If you spend even a second considering whether to get a link because of its PA or DA, you have wasted that second. While using PA, DA, MozTrust or MozRank to sort opportunities can be useful if automated and asynchronous, you should never stop and sweat whether a link is worth going after. Of course it is. Get that link. Go get it now. Happy link hunting!
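To make the dilution point concrete, here is a rough sketch - not Moz's actual model; the scores, damping factor, and link counts are illustrative assumptions - of how simplified PageRank-style calculations split a page's link value across its outbound links:

```python
# Illustrative only: a simplified PageRank-style split, not Moz's actual model.
def link_value_passed(page_score: float, outlinks: int, damping: float = 0.85) -> float:
    """Value passed by one link: the page's score, damped, divided by its outlink count."""
    return page_score * damping / outlinks

# A high-scoring page with 500 outbound links passes roughly 0.15 per link...
print(link_value_passed(90, 500))
# ...while a modest page with only 10 outbound links passes about 3.4 per link.
print(link_value_passed(40, 10))
```

In other words, the per-link value depends heavily on the linking page's outlink count, which PA and DA alone do not tell you.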

    | rjonesx.
    0

  • If the URL can only be accessed via one page, there is no way Google will treat it as duplicate content. 301 redirects should certainly help you avoid a duplicate content profile, but when it comes to Google Search Console, it usually takes time for things to update. I also like the idea from CleverPhD: adding a rel=canonical tag to your final page will help confirm to Google which version is preferred, and after the next crawl Google will drop the alternate URL from the index. Hope this helps!
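For reference, a minimal sketch of what that canonical tag could look like in the head of the alternate page (the URLs are placeholders):

```html
<!-- placed in the <head> of the alternate/duplicate URL,
     pointing at the preferred version -->
<link rel="canonical" href="https://www.example.com/preferred-page/" />
```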

    | MoosaHemani
    0

  • If it is not done properly it will affect your site greatly. It is important to use a tool like DeepCrawl or Screaming Frog to make sure you have a copy of all your URLs before you create the redirects; then use Yoast's permalink helper (linked below) to generate redirects from your old permalink structure to the /%postname%/ permalink structure.

Your redirects: add something like the following to the top of your .htaccess file (an illustrative pattern; adjust it so the capture group matches the post ID in your old URL structure):

RedirectMatch 301 ^/home/([0-9]+)\.asp$ http://domain.com/?p=$1

Please note that this relies on WordPress to do a second redirect, from the post ID to the post.

Remember, when you switch platforms, expect a drop on Google before it rebounds; treat it as if you were changing domains:

https://moz.com/community/q/how-to-keep-old-url-juice-during-site-switch
https://moz.com/blog/achieving-an-seo-friendly-domain-migration-the-infographic
(larger version of the infographic: http://www.aleydasolis.com/images/seo-website-domain-migration.gif)
https://moz.com/community/q/changing-domains-how-much-link-juice-is-lost-with-301-redirect

For WordPress these are extremely helpful:

https://yoast.com/wp-content/permalink-helper.php
https://yoast.com/change-wordpress-permalink-structure/

I hope this helps,
Tom

    | BlueprintMarketing
    0

  • You're very welcome Brittany. I've not personally been involved in these situations, but have seen this play out in a variety of different industries. Good luck and keep at it!

    | Todd_McDonald
    0

  • Yes, I was thinking of testing this. I have just checked our images, and from what I can see the devs have set the alt text to default to the product title, while the image title is a bit more descriptive. Does the image title/legend help with anything, or should we ignore those and update the alt text instead?

    | BeckyKey
    0

  • Hi Nathaniel, sorry to be late to the party, but I just noticed this because I was doing some research on PrestaShop and SEO (considering helping a client on that platform). I came across a person who might be able to help you - Lesley Paone. Lesley is "an expert in PrestaShop development and is also a PrestaShop certified developer. He is also one of the global moderators of PrestaShop's e-commerce forum, which is one of the largest e-commerce forums on the internet. He also does SEO for e-commerce sites and is very active on the SEO forums for Moz as well." You can find him on Twitter at @lesleypaone, on his site at dh42.com, and by private messaging him here on Moz. Good luck!

    | DonnaDuncan
    0

  • "Should I proceed with back linking and adding new content yet (one blog a week and a new review) or hold off for a couple more weeks?"

Content - Content - Content... I have a client in the eCig market at the moment, and I can tell you that Google doesn't like traditional backlinks for these kinds of sites. This comes from having worked with others in the past too. You need to get inventive and get some up-to-date content on there. Set up some Google Alerts around eCig news, and when something comes through, hop onto your blog and get started with your own version of the story. Give insights, be controversial, write how-to guides, and try to create an air that says "amazing content - oh, and they sell a product", not the other way around; otherwise you will be chasing your tail.

Also, don't stop at 500 words - you need to put some real time into creating content that will make others (and Google) sit up and take notice. Then share the sh*t out of it in as many ways as you possibly can. I don't mean spam though - create a few well-thought-out calls to action and use these as a hook.

I hope this helps. -Andy

    | Andy.Drinkwater
    0

  • Do not change the URLs unless necessary. The URLs you've described above are perfectly fine. Look at Amazon's URLs to see that it is absolutely OK to not have /product-name-here style URLs.

    | anthonydnelson
    0

  • Hi Kashif,

Following this structure should get it done:

RewriteRule ^subdirectory/(.*)$ /anotherdirectory/$1 [R=301,NC,L]

For your example it would be:

RewriteRule ^FolderA/FolderB/(.*)$ /FolderA/FolderC/FolderB/$1 [R=301,NC,L]

For fewer than 100 redirects it "may" also be prudent to do them individually. I say this because it makes it very obvious what is going on if you ever need to edit the file again. I would use Excel or OpenOffice to create a spreadsheet and just concatenate the cells together, following this example:

Redirect 301 /path/to-old-url https://www.mydomain.com/path/to-new-url

Ref site: http://coolestguidesontheplanet.com/redirecting-a-web-folder-directory-to-another-in-htaccess/

Hope this helps,
Don

    | donford
    0

  • Hi Karen, the main concern is ensuring that your blog follows best practices from a technical perspective when it comes to site structure. You are on the right track with www.yourdomain.com/blog instead of creating a subdomain to host your blog; this will allow for optimal link juice flow. In terms of naming conventions, you are getting into UX, and as Russ has pointed out, that has to be properly applied to your audience. If you are primarily sharing news about your industry on your site, a /news/ title may be best. It may also suggest a higher degree of professionalism, depending on the demographic. If you follow these steps, you will be in great shape. As with anything, the name has to coincide with the target audience. Best of luck moving forward! Rob

    | Toddfoster
    0

  • The concept Google is getting at here is that your CSS and JS contain elements that are critical for the page to render. The problem is that as the browser downloads them, they can block other resources from being downloaded, because the browser wants to read these files to see everything it needs to render the page. Part of fixing render blocking is reducing the number of files a browser has to download, especially those in the critical path (HTML, CSS, JS) that can block the downloading of other files (images, etc.).

Google is getting even more specific in your case: they are looking at the "above the fold" parts of your page. What Google wants you to do is take any CSS or JS used to render what is above the fold on that page and inline that code into your HTML file. That way, when the browser downloads the HTML file, it has everything it needs to render the visible, above-the-fold part of the page, versus having to wait for the CSS and/or JS files to download.

The problem is that "above the fold" is relative, due to the multiple browser sizes, operating systems, and devices that your web server sees on a regular basis. If you have a really good front-end developer, they can figure out which viewport size is most common and inline the CSS and JS for it (note this may differ per page) into your HTML - assuming the CSS and JS do not bloat your HTML file size too much. One approach is to take your most common large viewport size and inline everything that is above the fold at that size, so you have everything covered as the viewport gets smaller. The issue there (as with most responsive sites) is a lot of code bloat for phone browsers. You can also use a sniffer to determine the viewport size and inline the appropriate CSS and JS on the fly. I have also seen people suggest designing websites for the phone first and then expanding out from there.

This is the best website I have seen on how all these files interact and what Google is really getting at: https://varvy.com/pagespeed/critical-render-path.html

Here is what I would do. Have a single CSS file for your site and host it on your server, not an external domain; this is best practice. Take the time to strip out everything you do not use to get the file size down, minify and compress it, and reference it in your header. This may help with render blocking, since you are reducing the number of requested files to just one, but it may not help with the above-the-fold render blocking.

If you want to move forward with "fixing" the above-the-fold render blocking, extract the CSS that is critical to render the above-the-fold items on your site (noting the caveats above), place it inline within your HTML file, and put the rest in your single CSS file: https://varvy.com/pagespeed/optimize-css-delivery.html

Have a single JS file and host it on your server. If there is any external JS, see if you can fold it into your single JS file. Strip out all the JS you do not use to get the file size down, then minify and compress it. To get past the above-the-fold render blocking, figure out what JS is needed to render the page above the fold. Inline that JS within your HTML, put all the other JS in a single file, and defer loading of that file using this technique: https://varvy.com/pagespeed/defer-loading-javascript.html

I noticed your external JS file for Google AdServices. You may not be able to put that JS into your main JS file and may have to keep the external reference; I would then try to defer its loading using the technique above. You need to test to make sure doing this does not throw off how your ads are displayed or tracked. I would also make sure your GA or other web-tracking JS is inlined as well; otherwise you risk throwing off your web stats.

This is what makes all of this tricky. The Google PageSpeed tool just checks a list of best practices to see whether they are present. It does not check whether your page is actually getting faster, or whether changing any of these things breaks the function of your site: https://developers.google.com/speed/pagespeed/insights/ ("PageSpeed Insights analyzes the content of a web page, then generates suggestions to make that page faster.") This is why you need a tool that shows actual page speed and a waterfall chart with timings, so you can see how everything interacts; webpagetest.org is a common one. It gets really complicated, really fast, and this is where a really good front-end guy or gal is worth it.

I would start with my initial simple suggestions above and not sweat the above-the-fold stuff. Test your site's actual speed and see how it does - you can also set up GA to give you page-speed data - and then decide whether you need to take it to the next level. Another thing you can try (I have not been able to get this to work myself) is Google's module that does all the above-the-fold inlining and other speed tricks for you: https://developers.google.com/speed/pagespeed/module/ Just like above, I would benchmark your performance and then see if it makes a difference on your site. Good luck!
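To illustrate the inline-critical-CSS and deferred-JS techniques above, here is a minimal sketch following the varvy.com defer pattern (the file name and styles are placeholders, not from your site):

```html
<!DOCTYPE html>
<html>
<head>
  <!-- Critical above-the-fold CSS inlined: no extra request blocks first paint -->
  <style>
    .hero { font-size: 2em; margin: 0 auto; }
  </style>
</head>
<body>
  <div class="hero">Above-the-fold content renders without waiting on external files.</div>
  <script>
    // Defer the remaining JS: request it only after the page has finished loading
    function loadDeferredJS() {
      var s = document.createElement("script");
      s.src = "/js/site.min.js"; /* placeholder: your single combined JS file */
      document.body.appendChild(s);
    }
    window.addEventListener("load", loadDeferredJS);
  </script>
</body>
</html>
```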

    | CleverPhD
    0

  • Did you get this sorted? Looking at your page now, it appears that while you have a lot of occurrences of the same keywords (track 43, curtain 29, and curtain track 22), their density is quite low - 4% or less. This wouldn't be classed as keyword stuffing as far as I'm aware, as you have plenty of other relevant content to dilute their frequency.
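For what it's worth, keyword density is just keyword occurrences divided by total words; a quick sketch (the sample sentence is made up, not taken from your page):

```python
def keyword_density(text: str, keyword: str) -> float:
    """Percentage of words in `text` that match `keyword` (case-insensitive)."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    return 100.0 * words.count(keyword.lower()) / len(words)

sample = "Our curtain track range includes ceiling track and wall track options"
print(round(keyword_density(sample, "track"), 1))  # three of eleven words -> 27.3
```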

    | ben_dpp
    0

  • Well, you are on the right path in thinking about how to reduce the number of unneeded pages. Here is how I would approach it.

Check the query volume on those specific "Map to XXX" queries and the search volume on those pages. See if you can detect a pattern: is there enough search volume to justify those pages, and do they result in significant traffic? Then try to determine what the content on all those separate pages is, and whether it is any good. Are they making extra pages just to make extra pages? Sure, in theory you could do a page per query, but I would bet that if they have a ton of hotel pages, the info on those pages is a bunch of boilerplate crap copied from somewhere else. Even if the search volume were there, do they have a good enough page, with good enough content, to rank for it?

Now that Google has Hummingbird-type algorithms, there is less need to get so specific about matching queries on a page-by-page basis. Build a single, awesome page that is really helpful to users and has original content; that is how you win the big queries, and then fill in for the rest. You can then use the title, description, and H1/H2 headings to surface the important information.

Remember that rel=canonical will help Google understand what your main page is and what your secondary/duplicate pages are in this specific case, but I am not sure Google would see it as the consolidated, awesome single page. Rel=canonical is more for showing how the parts are just parts of a whole page that is already there; it is more to help clean up duplicate content: http://googlewebmastercentral.blogspot.com/2013/04/5-common-mistakes-with-relcanonical.html

As I read your post originally, this seemed to be more of the issue. This was why I was encouraging you to take the content in the tab (which did not seem substantial) and put it on the main page and use the canonical - or, if the content was junk, then it did not matter as much. Hope that makes sense.

    | CleverPhD
    0

  • Hello, my friend. "Password"? I assume you mean keyphrase/keyword. Well, first of all, the keywords you're talking about are not difficult. And yes, if you use "in" it won't matter. In fact, go for natural word flow, since it's better for UX, and that's what Google likes. Additionally, you don't have to use the whole keyphrase in H tags. If possible, sure; if not, just work it into the content naturally. Google has very sophisticated AI, so no worries - they will understand what the page is about. Just don't go overboard with keyword stuffing.

    | DmitriiK
    0

  • Totally, I hope there are no issues though, it would be a mess.

    | LesleyPaone
    0

  • Thanks, Moosa - appreciate all the input. Are any of you guys available as a freelancer for some guidance on how I should formulate my content strategy moving forward?

    | tbreak.ae
    0

  • I agree with Moosa, but would add another layer of analysis. You need to do a type of content audit on these pages. You can go into GA to just look at traffic, or use a tool (my new favorite) called URL Profiler to pull GA, OSE, and Majestic link data, plus social media shares, for your sold product URLs. For each sold product page you have two options:

1. 301 the page - If the "sold" page generates a fair amount of traffic, and even has some links coming in, you probably want to 301 redirect it. The 301 needs to point to a page with similar semantic information; ideally a main product category page, so you can build the traffic to that page. Example: if you sell Ford Mustangs and you have a lot of "sold" Ford Mustang pages that still get traffic and even have some links to them, 301 those pages to your main Ford Mustang category page, which links to all of your Ford Mustangs still for sale. Good for Google, good for the user. The key points are that (a) the "sold" pages are still generating value (traffic and links) and (b) you redirect to a semantically related page. Redirecting to the home page is a bad idea, as the target needs to be closely related in topic: a multi-brand "cars for sale" page (your home page's theme) is not as closely related to a "Ford Mustang for sale" page as your main Ford Mustangs category page is. It would also behoove you to make that category page really kick a$$ content-wise.

2. 410/404 the page - If you find a large group of "sold" pages that do not get much traffic and/or do not have much link equity, just let them 404/410. Show a helpful not-found page with links to other sections of your site and even a search function. FYI - I like the 410 directive, as it means "permanently gone" versus a temporary absence.

How do you get the data I am talking about on all the "sold" pages? Using a tool like URL Profiler, you put in all of your "sold" URLs and the software uses APIs to get data from Search Console, GA, OSE, and Majestic (among other tools), pulling everything into a single row per URL. You can then look at the data for each URL and decide between option #1 and option #2. Moz has an article on how to do a content audit, and you can search the web for other examples. A simpler version of this would be to use the advanced search within GA and pull the organic landing-page traffic for those pages.

Some SEOs would say that option #2 is blasphemy - i.e., you always want to 301 redirect. Why would you ever want to lose traffic by setting up a 410? That will cause errors to show up in Search Console! You will lose link equity and traffic! This is why you have to perform the content audit. If you have 1,000 pages but 800 of them send next to no traffic to your site and have generated 2 links, you can 410 the 800 pages and never notice the difference. You will not miss the traffic or links, as they had none to begin with. All those 800 pages are doing is wasting your Google crawl budget and signaling to Google that you have a bunch of low-quality pages on your website. Also, don't panic when you see all the 410s in Search Console; just sort by date and then by priority to make sure these are all pages you intended to 410. Over time (about 3-4 months) they will naturally fall out.
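To make the two options concrete, here is a hedged .htaccess sketch using Apache's mod_alias (the paths and domain are hypothetical):

```apache
# Option 1: 301 a "sold" page that still earns traffic/links
# to its closest category page
Redirect 301 /mustangs/2012-gt-sold https://www.example.com/mustangs/

# Option 2: mark a low-value "sold" page as permanently gone (410)
Redirect gone /mustangs/old-listing-12345
```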

    | CleverPhD
    0

  • I am working on an online digital marketing magazine, and many times I am explaining a concept or something about marketing. If I am talking about topic "A" and I explain what it means and why it is important, I like to embolden the topic I am explaining and put the explanation in italics. In my view, this helps with skimming the article to pick out the more important parts. Additionally, since the topics being explained are the targeted keywords, I figured it would add to SEO; I just hope it doesn't hurt rankings, given the ever-looming threat of over-optimization. I am not sure of any quantitative limits on bold and italicized text, but I would love to see some numbers on this topic. I guess I am approaching this from a usability and reader perspective, but everyone is different, and I know someone will look at the article and be like, "oh no, why!?" Please let me know if you agree with this way of thinking, or have a good reason why I should stop thinking this way.

    | LearnInternetGrow
    0

  • Hello Emil - to answer your questions:

On-Site Content

For starters, your best bet is probably to initiate a campaign to produce a fair amount of static content for your website, which will form the "backbone" of your on-site ranking factors. For any site, you want to make sure your home page and any landing pages feature a fair amount of content (estimates range between 500 and 2,000 words, on average) in order to maximize ranking potential. Text-based content is huge, especially if your site is primarily based on images/JavaScript, etc.

Next Steps:

1. Create a categorical page and ensure all images are unique and receive alt text. This is the way to go if you get significant traffic to your website but not your social media accounts, or you don't have the time/resources to be on social media all the time. The idea here is to create static content on your site, which Google typically reacts to more positively.

2. Create a blog for images and share them on social media platforms. This is the way to go if you want to extend and build brand awareness. From what you tell me, this sounds like something that is occurring naturally and something you can leverage for future traffic/rankings. Social media signals don't directly impact rankings, but they do create traffic, which can produce positive signals to Google if your content is well received on social media. This requires a bit more time and dedication to social media, but the results are there for the taking. It also doesn't require lengthy articles.

Social Media Platforms:

In terms of which social media outlets to use - Instagram and Pinterest are great places to start, and you can utilize Twitter and Facebook as well. If you have something more business-oriented, LinkedIn is a decent option. For you, though, Instagram should be your bread and butter. There's no reason why you can't have a blog on your domain and an Instagram account with the same set of images, etc.

Duplicate content is a bit of a red herring, as it tends to be seen as a penalty-causing factor when in reality it is generally neutral to a website's rankings. The only way this would concern me is if you are using spammy tactics or literally duplicating another website verbatim. If your content recurs only on your own site (and no one else's), then you are probably fine from a Panda perspective.

Hope this helps, and feel free to bounce any ideas off of me moving forward - love to help! Rob

    | Toddfoster
    0