Posts made by max.favilli
-
RE: Help needed on HTTPS
What impact? I am not sure I understand the question.
-
RE: Does Google really care about cheaters?
In theory a spam report should be the thing to do, but usually it doesn't change anything.
I remember seeing a video of either Cutts or Mueller saying that a spam report is not going to produce any direct effect; they just use the information to improve their algorithm in the next update.
I know for sure that in certain cases they take immediate action, because months ago a news site with DA 90+ here in Italy started selling do-follow links in their articles (for a small fee of 15k euro per month, by the way!). I reported it, and a few others did too, and within a month they got a manual action in their GWT account and stopped doing it.
But in many other cases where I and colleagues in the business filed spam reports for much smaller sites, nothing happened, and years later I can say they are still happily enjoying good rankings from very obvious black hat tricks.
SEO and democracy don't seem to get along well.
-
RE: How do you link your adaptive mobile site to Google Analytics?
I was just dealing with that a few days ago.
Google has been kind enough to provide a very detailed guide on how to do it: https://developers.google.com/webmasters/mobile-sites/mobile-seo/configurations/separate-urls
You need to annotate your HTML accordingly; all the details are there, just read it.
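For reference, the annotations that guide describes boil down to two tags (example.com and the m. subdomain here are placeholders, not your actual URLs): the desktop page declares its mobile alternate, and the mobile page points its canonical back at the desktop page.

```html
<!-- On the desktop page (www.example.com/page-1): -->
<link rel="alternate"
      media="only screen and (max-width: 640px)"
      href="https://m.example.com/page-1">

<!-- On the corresponding mobile page (m.example.com/page-1): -->
<link rel="canonical" href="https://www.example.com/page-1">
```

The pairing has to be one-to-one: each mobile URL canonicals to exactly the desktop URL that names it as an alternate.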
-
RE: Is there any data available on how deeplinking contribute to improve domain authority?
In this Whiteboard Friday Rand addresses the topic, and interestingly, many (diverse) deep links from the same domain are quite valuable.
http://moz.com/blog/whiteboard-friday-link-diversity
But I still can't find any clue about the effect on DA.
-
RE: Is there any data available on how deeplinking contribute to improve domain authority?
Maybe my question is not clear enough. I am not asking about a strategy to get backlinks. I know how to get backlinks.
What I don't know is whether there's any analysis, study, data, or test supporting any particular strategy for diversifying backlinks between the home page and internal pages.
In other words, to be clearer, let's make two examples (just for the sake of being understood).
Is there any analysis, study, data, anything, saying it is generally better to have 50% of backlinks to the homepage and 50% to internal pages? Or is it better to have 90% of backlinks to the homepage and 10% to internal pages, etc.?
-
RE: Is there any data available on how deeplinking contribute to improve domain authority?
I know DA is increasing as well. I can see that. But I have no idea what correlations are there, or whether there's any best practice or strategy to maximize the benefit for both.
-
Is there any data available on how deeplinking contribute to improve domain authority?
Let's say you are collecting backlinks across a variety of pages for foo.com, like: foo.com/clown, foo.com/juggler, foo.com/lion-tamer and so on...
Page Authority will improve for each of those pages, but what is going to happen to Domain Authority?
Is there any data available? Any educated guess on the best strategy for getting the best results for both PA and DA?
-
RE: Crawl budget
What's the background of your conclusion? What's the logic?
I am asking because my understanding of crawl budget is that if you waste it you risk 1) slowing down the recrawl frequency on a per-page basis, and 2) having the Google crawler give up on the website before it has crawled all the pages.
Are you splitting your sitemap into sub-sitemaps? That's a good way to spot groups/categories of pages being ignored by the Google crawler.
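The splitting itself is mechanical. A minimal sketch in Python (example.com and the path layout are hypothetical, not your site): group a flat URL list into sub-sitemaps by path section, so each section can be submitted as its own file and checked separately for crawl coverage.

```python
# Sketch: split a flat URL list into sub-sitemaps by first path segment,
# so per-section indexing can be monitored independently.
from collections import defaultdict
from urllib.parse import urlparse

def split_sitemap(urls):
    """Group URLs by their first path segment (e.g. /blog/, /products/)."""
    groups = defaultdict(list)
    for url in urls:
        section = urlparse(url).path.strip("/").split("/")[0] or "root"
        groups[section].append(url)
    return dict(groups)

def sitemap_xml(urls):
    """Render one group of URLs as a standard sitemap file."""
    entries = "\n".join(f"  <url><loc>{u}</loc></url>" for u in urls)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            + entries + "\n</urlset>")

urls = [
    "https://example.com/",
    "https://example.com/blog/post-1",
    "https://example.com/blog/post-2",
    "https://example.com/products/widget",
]
groups = split_sitemap(urls)
# groups now has keys 'root', 'blog', 'products'; write each group to its
# own sitemap file and list them all in a sitemap index.
```

Submit the resulting files individually and the per-sitemap indexed counts will show you which sections the crawler is neglecting.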
-
RE: Does Moz offer a tool to measure competitor web visitor traffic?
There are a few tools online offering traffic estimation services. In my opinion none is accurate enough to be useful; just try them on the URLs of websites you have log access to and see how poor their guesses are.
As far as I know, Moz doesn't offer such a tool.
If I had to name one of those tools I would name SEMrush. When I tested its accuracy I found it reported traffic volumes between 10% and 100% off, which is more accurate than many others and still gives you a rough idea of a website's traffic.
-
RE: Google Frequently Indexing - Good or Bad?
It's normal.
Google has a crawl budget.
Once your website is indexed (after sitemap submission) the crawler visits your website daily and, depending on a number of factors, crawls only a portion of it: a different yet somewhat overlapping portion every day. It usually crawls every single page within a few weeks, but I have seen the crawler stubbornly refuse to crawl certain pages for longer.
A small website with just 100 pages is unlikely to hit any crawl budget limit.
When it comes to fresh content (new pages) it's hungrier: the Google crawler likes fresh food, and the Google algorithm tries to index fresh content as soon as possible.
So what's happening to your website is perfectly normal.
-
RE: What was your experience with changing site url's?
Is the content staying the same?
If the content is staying the same, just the URL structure is changing, and the new URL structure is not depriving the URLs of the keywords relevant to the page content...
If all these conditions are satisfied, then in my experience (having moved a few websites, replacing only the URL structure and theoretically improving it) nothing changed: I didn't see any traffic lost in GWT or Analytics, and I didn't even notice any unusual fluctuation.
Of course, keep in mind the traffic is moving from the old URLs to the new ones, so depending on your reports and your new URL structure you will see it in different places.
-
RE: Fresh content..how important to SERP position?
My experience improving page copy tells me the ranking improvement is not dramatic, and the exact amount depends on a variety of factors. As far as I can tell the biggest of these factors are domain authority and keyword competition: if you change content on pages of a domain with a high DA, the ranking of those pages will improve more than it would on lower-DA pages.
The improvement also seems proportional to the previous SERP position; in other words, it's easier to go from position 20 to the first page than from position 3 to position 2, all other factors being equal.
Of course, if you have a page with a lot of backlinks and a decent ranking despite thin content, or content poorly related to the keywords you are targeting, improving it will give you the biggest jump.
Anyway that's just my experience, no golden rule.
EDIT: now that I have read EGOL's answer I can see I totally missed the real core question.
I have never seen a page's ranking improve just because fresh new content replaced the old one; only because new, better quality content replaced old, poor content.
And whenever I have heard or read about a fresh-content boost, it was always about new content on new pages, and it depends on the query being searched. If you browse Rand's old Whiteboard Fridays you will find one on this topic which explains it.
In one sentence: if you publish fresh content about a sports event on the night of that event, you may benefit from a freshness boost for queries related to it. For obvious reasons Google believes new content about a recent event is more relevant than stale content.
-
RE: Exact keyword match for meta title and h1 what is best practice?
Targeting the keywords with the highest volume can be misleading: they are not necessarily the best converting keywords, and since everyone usually prefers to shoot at the biggest target, competition is fierce.
I am not saying it's wrong, but I would be more selective and choose keywords on the basis of a variety of factors.
-
RE: Domain Authority? Why is it declining according to Moz?
In addition to the good things said by others:
Check your backlink profile against your competitors'. Probably they are getting more backlinks, or more authoritative ones; or maybe your old backlinks are poor quality and dropping in value.
Check your backlink profile for anchor diversity and cross-check it with your competitors'.
Are you having traffic drops? Maybe not overall, but hidden in some traffic segments? Are those drops linked to dropping rankings for some keywords? If so, why? Increasing competition? Some backlink-related issue? See above.
Is your content quality dropping compared to your competitors'? Are you serving pages with thin content while your competitors offer better content? Less duplication? More targeted?
I could go on, but this is at least where I would start my homework.
-
RE: Our client's web property recently switched over to secure pages (https) however there non secure pages (http) are still being indexed in Google. Should we request in GWMT to have the non secure pages deindexed?
301-redirect the old URLs to the new https ones. Check GWT for crawl errors for weeks if not months (depending on how many indexed pages you have and how many are crawled per day) and fix any you find.
You should then see traffic move from http to https in GWT.
Again, depending on the total number of pages indexed and your crawl rate, it may take months, even many months, before all pages are re-indexed and start showing in the SERPs as https.
But that's normal, and not bad at all.
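For what it's worth, a site-wide 301 like that is only a few lines of server configuration. A sketch assuming nginx and the hypothetical example.com (Apache users would do the equivalent with mod_rewrite or a Redirect directive):

```nginx
# Catch all plain-http requests and 301 them to the https equivalent,
# preserving the original path and query string.
server {
    listen 80;
    server_name example.com www.example.com;  # hypothetical domain
    return 301 https://example.com$request_uri;
}
```

Using `return 301` rather than a rewrite rule keeps the redirect permanent and cheap, which is what you want search engines to see.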
-
RE: Subdomains vs. Subfolders vs. New Site
I don't see any reason, from your description, to choose any route other than the first option.
Check the domain authority of the second-level domain your client is currently using and estimate how much time and money you would need to replicate the same DA on a new domain, then ask yourself and your client if it's worth it. Don't forget to mention that all the work to rank the new domain is not going to help the old one, and once the same DA is reached you will still face duplicated effort to improve the DA of both.
Of course there are other factors which could still weigh in favor of a separate domain: trivially, if she wants to distance it from her main practice, if the existing domain has suffered a penalty, etc.
But without knowing more, I don't see a reason to go that way.
-
RE: Social plugin making URLs wonky, would this hinder SEO efforts?
Do you mean the plugin is redirecting all the traffic to that horrible URL? Or just redirecting users after they click to share? Or is that URL the one the plugin shares?
The damage changes depending on the answer.
But I would get rid of the plugin anyhow.
-
RE: Exact keyword match for meta title and h1 what is best practice?
I have seen the Google algorithm rank the same page for very similar phrases where the keywords just changed places, but it doesn't "always" do it. You can help it learn that the page is a good fit for both queries through the copy of the page.
Google knows about synonyms and often shows the same, or very similar, SERPs for different keywords considered to have the same "exact" meaning and usage in the language. But in my experience it treats plurals differently.
I would use an exact match for the title and an exact match for the h1 if it makes sense, while avoiding unnatural language.
Unfortunately SEO is a land where certainty is scarce, and I would recommend testing different versions to find out what works best in your case.
-
RE: Best way to find the best keywords to write Q&A
That's one of those cases where I would use BuzzSumo: see what is most shared, and create similar content around those same keywords.