Author Topic: Oct Real World Whole Product results - Avast is last!  (Read 29437 times)

0 Members and 1 Guest are viewing this topic.

Offline Lisandro

  • Avast team
  • Certainly Bot
  • *
  • Posts: 67194
Re: Oct Real World Whole Product results - Avast is last!
« Reply #30 on: November 20, 2011, 07:49:48 PM »
This is also why I don't consider the sorting fair - it should be sorted by the red column, not the green column.
You're right, as the default autosandbox option is auto-decide (block), isn't it?
The best things in life are free.

Offline Vlk

  • Avast CEO
  • Serious Graphoman
  • *
  • Posts: 11655
  • Please don't send me IM's. Email only. Thx.
    • ALWIL Software
Re: Oct Real World Whole Product results - Avast is last!
« Reply #31 on: November 20, 2011, 08:16:16 PM »
This is also why I don't consider the sorting fair - it should be sorted by the red column, not the green column.
You're right, as the default autosandbox option is auto-decide (block), isn't it?

Yes, I'd say the messaging is quite strong (i.e. I don't quite see too many people getting infected by overriding the autosandbox option).

This will be further refined in v7 where the autosandbox works a bit differently.

Thanks
Vlk
If at first you don't succeed, then skydiving's not for you.

Offline Lisandro

  • Avast team
  • Certainly Bot
  • *
  • Posts: 67194
Re: Oct Real World Whole Product results - Avast is last!
« Reply #32 on: November 20, 2011, 08:19:31 PM »
This will be further refined in v7 where the autosandbox works a bit differently.
Can you anticipate the news?
The best things in life are free.

ady4um

  • Guest
Re: Oct Real World Whole Product results - Avast is last!
« Reply #33 on: November 20, 2011, 08:49:45 PM »
This will be further refined in v7 where the autosandbox works a bit differently.
Can you anticipate the news?

I hope he won't. Anticipation means that he would generate expectations. And it is way too early for expectations. It's not that we would have anything to say or do with that info. Let's wait patiently for the future beta, whenever it is really ready for actual practical use.

Offline Lisandro

  • Avast team
  • Certainly Bot
  • *
  • Posts: 67194
Re: Oct Real World Whole Product results - Avast is last!
« Reply #34 on: November 20, 2011, 09:12:30 PM »
Anticipation means that he would generate expectations.
No, to me anticipation means discussion, participation, helping to improve, building the software with others' opinions. This is what keeps me here in the forum, participating.
The best things in life are free.

ady4um

  • Guest
Re: Oct Real World Whole Product results - Avast is last!
« Reply #35 on: November 21, 2011, 01:21:29 AM »
Anticipation means that he would generate expectations.
No, to me anticipation means discussion, participation, helping to improve, building the software with others' opinions. This is what keeps me here in the forum, participating.

Sure, when the beta is ready for exactly those purposes :).

dagrev

  • Guest
Re: Oct Real World Whole Product results - Avast is last!
« Reply #36 on: November 21, 2011, 01:33:22 AM »
How do they translate into avast?
What are the dependent settings?
Behavior Shield and Autosandbox.

In avast's case, it's just the autosandbox.

This is also why I don't consider the sorting fair - it should be sorted by the red column, not the green column.

Thanks
vlk

A good piece of information that provides some perspective.

DBone

  • Guest
Re: Oct Real World Whole Product results - Avast is last!
« Reply #37 on: November 21, 2011, 01:56:09 AM »
If it's sorted by the red column, then we're 2nd to last, hardly bragging material.  ;)

true indian

  • Guest
Re: Oct Real World Whole Product results - Avast is last!
« Reply #38 on: November 21, 2011, 08:32:18 AM »
I'd like to take this opportunity to comment a little bit more on the test and the results.

First, let me say that the recent results (especially the Oct 2011 results) don't make me too happy. You're right that avast did quite poorly, and this needs to be fixed.

However, I have to object to the accuracy of the name of this test (or of other "Real-World" tests I have seen recently, for that matter). For starters, the test is done as follows: every day (or every few days), the tester pastes a few links into the browser's address bar. The links point to executable files that are downloaded from the Internet and then executed on the computer (unless the tested AV stops them). Thus, all modules, including the Network and Web Shields, are exercised, which makes the test much more "real-world" than the traditional on-demand tests. So far so good. That said, there's one principal problem here: in a real-world scenario, how does the user (and/or the browser itself) get to these URLs? That is, why would the user (in a real-world scenario) paste such a link into the browser? What happens in the real world is that the browser is typically redirected to such a URL through an exploit, most likely a JavaScript exploit pack or something similar. And that's the part of the equation that the test completely ignores. It has been shown many times that one of Avast's main strengths is its detection of such malicious scripts/redirectors, which generally act as the entry points for the actual malware (the payload). Our primary emphasis lately has been on blocking those entry points rather than focusing so much on the actual binaries, which would only be downloaded to the computer if the exploit got executed.

Another thing worth discussing is the size of the test set. If you look at the chart, it may look terrible, but in absolute numbers it means that avast missed 18 samples. Eset missed 11. AVG missed 10. Avira missed 8. Are the differences statistically significant? They probably are, but it certainly doesn't look like such a disaster when you look at the numbers like this.
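The miss counts in that paragraph can be sanity-checked with a quick two-proportion z-test. This is an editorial sketch, not part of the original post: the ~500-samples-per-product figure is an assumption (IBK later describes the set as roughly 50x the 10 samples some testers use), so the result shifts if the true set size differs.

```python
import math

def two_prop_z(misses_a: int, misses_b: int, n: int) -> float:
    """Two-proportion z-statistic for the miss rates of two products
    tested against the same number n of samples each."""
    p_a, p_b = misses_a / n, misses_b / n
    pooled = (misses_a + misses_b) / (2 * n)          # pooled miss rate
    se = math.sqrt(pooled * (1 - pooled) * (2 / n))   # standard error of the difference
    return (p_a - p_b) / se

# Avast's 18 misses vs AVG's 10, assuming ~500 samples per product
z = two_prop_z(18, 10, 500)
print(f"z = {z:.2f}")  # |z| < 1.96 means not significant at the 5% level
```

With these assumed numbers the statistic comes out around 1.5, below the conventional 1.96 cutoff, which illustrates why small absolute differences in misses are hard to call significant on a single month's data.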


In any case, we do understand the importance of doing well in these tests (and we even have a limited understanding of the necessity of protecting our users better against such manually downloaded malware binaries), and that's why we're making some important changes in our upcoming avast version 7, due in Q1 next year. It will be very interesting to see whether the new version lives up to expectations. Fingers crossed.


Thanks
Vlk

VLK, I trust the avast! team. We had bad luck this time, but we know we can be the best, and avast! is still going to be a great choice for security, as it has been consistently excellent in previous AV tests. Remember, in the last on-demand tests we came 3rd in detection rates and edged out Avira, so this poor performance is just bad luck. Keep improving avast! and make it the unbeatable Naruto LOL! ;D We have done the same in our past. I am going to continue using avast! protection, as it is the best AV I have ever had!
« Last Edit: November 21, 2011, 08:37:59 AM by true indian »

Hellion

  • Guest
Re: Oct Real World Whole Product results - Avast is last!
« Reply #39 on: November 21, 2011, 09:15:54 AM »
Hi All,

I really like benchmarks and shiny graphs, and I base most of my trust on statistics, but there is always that thought in my mind saying: what if those other competitors paid off these "independent testers"?

I always need at least three sources to make a clear decision.

Look at this - http://www.av-test.org/en/tests/test-reports/julaug-2011/   (YES, I KNOW IT'S OLD.)

It's hard for me to lose faith in something when there is only one bad review...

Quick Edit.

Back to OP's graph.
Set the graph to show Jan - Nov and see the "Overall" result for this year so far.
« Last Edit: November 21, 2011, 09:21:36 AM by Hellion »

IBK

  • Guest
Re: Oct Real World Whole Product results - Avast is last!
« Reply #40 on: November 21, 2011, 10:19:18 PM »
I'd like to take this opportunity to comment a little bit more on the test and the results.

First, let me say that the recent results (especially the Oct 2011 results) don't make me too happy. You're right that avast did quite poorly, and this needs to be fixed.

However, I have to object to the accuracy of the name of this test (or of other "Real-World" tests I have seen recently, for that matter). For starters, the test is done as follows: every day (or every few days), the tester pastes a few links into the browser's address bar. The links point to executable files that are downloaded from the Internet and then executed on the computer (unless the tested AV stops them). Thus, all modules, including the Network and Web Shields, are exercised, which makes the test much more "real-world" than the traditional on-demand tests. So far so good. That said, there's one principal problem here: in a real-world scenario, how does the user (and/or the browser itself) get to these URLs? That is, why would the user (in a real-world scenario) paste such a link into the browser? What happens in the real world is that the browser is typically redirected to such a URL through an exploit, most likely a JavaScript exploit pack or something similar. And that's the part of the equation that the test completely ignores. It has been shown many times that one of Avast's main strengths is its detection of such malicious scripts/redirectors, which generally act as the entry points for the actual malware (the payload). Our primary emphasis lately has been on blocking those entry points rather than focusing so much on the actual binaries, which would only be downloaded to the computer if the exploit got executed.

Another thing worth discussing is the size of the test set. If you look at the chart, it may look terrible, but in absolute numbers it means that avast missed 18 samples. Eset missed 11. AVG missed 10. Avira missed 8. Are the differences statistically significant? They probably are, but it certainly doesn't look like such a disaster when you look at the numbers like this.


In any case, we do understand the importance of doing well in these tests (and we even have a limited understanding of the necessity of protecting our users better against such manually downloaded malware binaries), and that's why we're making some important changes in our upcoming avast version 7, due in Q1 next year. It will be very interesting to see whether the new version lives up to expectations. Fingers crossed.


Thanks
Vlk

Hi Vlk, I disagree with you.
1) Only about half of the links point directly to binaries/files; the rest are exploits. Among your misses you surely also encountered some exploits, not only direct links. The "problem" is (and it is even written in the report) that practically all products (including, of course, Avast) are good at blocking/detecting exploits/drive-by downloads. That's also why the percentages are so high. If you look at the latest research from Microsoft, the biggest issue for users is not 0-day exploits (according to their paper it's even close to 0%) but social-engineered malware, which also includes tricking users into clicking on links pointing to files. If you miss malware from the web, the test will and does reflect that. But I am glad to hear that the next version will improve further in this regard.
2) Too few samples: others use 10 samples for such a test and base ratings on that. We usually use 50x that size. Arguing that the sample size is too small doesn't sound fair. If it were 1 million, someone would say "who surfs to 1 million malicious sites...?", missing the whole point.
3) How user-dependent cases are interpreted is up to the user. I do not believe that a product which asks the user about everything should get the same score as a product which is able to distinguish between malware and goodware without leaving the decision to the user. Anyway, only in chart 2 can you sort based on the green bar. In chart 3 you can combine blocked + user-dependent.
4) I expected that Whole Product Dynamic Tests would also be criticized (like any other test) in the future whenever the scores are unfavorable for someone, despite the industry's own promotion of such sophisticated tests.
« Last Edit: November 21, 2011, 10:29:52 PM by IBK »

lord ami

  • Guest
Re: Oct Real World Whole Product results - Avast is last!
« Reply #41 on: November 21, 2011, 10:28:14 PM »
VLK spoke about script blocking etc. I install avast on my relatives' PCs myself, and I recently got a big thanks because of avast!. avast! indeed blocked a Facebook link that tried to exploit something and install the ZeroAccess rootkit. The link was blocked at its root as soon as it was clicked. That's the true protection avast! offers :)

The same link, on the other hand, was opened on a PC that had no protection at all. The rootkit was hard to remove because it made things very difficult inside Windows ^^

Offline Vlk

  • Avast CEO
  • Serious Graphoman
  • *
  • Posts: 11655
  • Please don't send me IM's. Email only. Thx.
    • ALWIL Software
Re: Oct Real World Whole Product results - Avast is last!
« Reply #42 on: November 21, 2011, 11:45:23 PM »
Hi IBK,

Many thanks for coming here and taking the time to respond. It's always good to see you here (for starters: IBK is the person behind AV-Comparatives.org).


1) Only about half of the links point directly to binaries/files; the rest are exploits. Among your misses you surely also encountered some exploits, not only direct links. The "problem" is (and it is even written in the report) that practically all products (including, of course, Avast) are good at blocking/detecting exploits/drive-by downloads. That's also why the percentages are so high. If you look at the latest research from Microsoft, the biggest issue for users is not 0-day exploits (according to their paper it's even close to 0%) but social-engineered malware, which also includes tricking users into clicking on links pointing to files. If you miss malware from the web, the test will and does reflect that. But I am glad to hear that the next version will improve further in this regard.

Fair enough. Social engineering certainly is an important attack vector, and it can indeed lead to users directly running binaries. However, I don't think that a typical social-engineering attack works that way (having the user download and manually run a binary).

BTW would you mind sharing a link to that MS report you're referring to?

2) Too few samples: others use 10 samples for such a test and base ratings on that. We usually use 50x that size. Arguing that the sample size is too small doesn't sound fair. If it were 1 million, someone would say "who surfs to 1 million malicious sites...?", missing the whole point.

All I was saying was that Avast missed 18 samples while e.g. product B and product C missed 11 and 10, respectively. Without getting into other tests (which of course deserve the same, or even bigger, criticism), I'm just questioning the statistical relevance of these numbers. No pun intended.

3) How user-dependent cases are interpreted is up to the user. I do not believe that a product which asks the user about everything should get the same score as a product which is able to distinguish between malware and goodware without leaving the decision to the user. Anyway, only in chart 2 can you sort based on the green bar. In chart 3 you can combine blocked + user-dependent.

This is probably the part of the test I'm most frustrated with. I just somehow disagree with the yellow category, simply because it tries to encompass all cases where the user has some control over the final decision. In the case of the avast autosandbox, the message is so imperative, and has such a clear recommended action, that I don't quite see a user deliberately overriding the default decision and actually getting infected. But anyway, as I've already said, we're refining the Autosandbox in v7, so we'll probably move these files to the green category.

4) I expected that Whole Product Dynamic Tests would also be criticized (like any other test) in the future whenever the scores are unfavorable for someone, despite the industry's own promotion of such sophisticated tests.

And that's fine, isn't it? Criticism is generally a good thing, provided it is substantive.

Thanks
Vlk
If at first you don't succeed, then skydiving's not for you.

Offline Asyn

  • Avast Überevangelist
  • Certainly Bot
  • *****
  • Posts: 76029
    • >>>  Avast Forum - Deutschsprachiger Bereich  <<<
Re: Oct Real World Whole Product results - Avast is last!
« Reply #43 on: November 22, 2011, 12:00:52 AM »
BTW would you mind sharing a link to that MS report you're referring to?

It's already here. ;)
http://forum.avast.com/index.php?topic=66267.msg697826#msg697826
W8.1 [x64] - Avast Free AV 23.3.8047.BC [UI.757] - Firefox ESR 102.9 [NS/uBO/PB] - Thunderbird 102.9.1
Avast-Tools: Secure Browser 109.0 - Cleanup 23.1 - SecureLine 5.18 - DriverUpdater 23.1 - CCleaner 6.01
Avast Wissenswertes (Downloads, Anleitungen & Infos): https://forum.avast.com/index.php?topic=60523.0

IBK

  • Guest
Re: Oct Real World Whole Product results - Avast is last!
« Reply #44 on: November 22, 2011, 09:06:38 AM »
All I was saying was that Avast missed 18 samples while e.g. product B and product C missed 11 and 10, respectively. Without getting into other tests (which of course deserve the same, or even bigger, criticism), I'm just questioning the statistical relevance of these numbers. No pun intended.

I know about the numbers; that's why I personally wait for the overall report, which is released after 4 months (see the comment on page 9 of the previous overall WPDT report), and only then start with the statistics.
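IBK's point about waiting for the aggregated report can be illustrated with the same kind of back-of-the-envelope arithmetic. This is an editor's sketch, not from the thread: both the ~500-samples-per-month figure and the assumption that the same miss rates would persist across all four months are hypothetical, purely to show how pooling more months of data sharpens the comparison.

```python
import math

def two_prop_z(misses_a: int, misses_b: int, n: int) -> float:
    """Two-proportion z-statistic for two products' miss rates over n samples each."""
    p_a, p_b = misses_a / n, misses_b / n
    pooled = (misses_a + misses_b) / (2 * n)
    se = math.sqrt(pooled * (1 - pooled) * (2 / n))
    return (p_a - p_b) / se

# One month: 18 vs 10 misses out of an assumed 500 samples
one_month = two_prop_z(18, 10, 500)
# Four pooled months at the same (assumed) rates: 72 vs 40 out of 2000
four_months = two_prop_z(72, 40, 2000)
print(f"one month: z = {one_month:.2f}, four months: z = {four_months:.2f}")
```

At the same underlying rates the pooled statistic roughly doubles (about 3.1 versus about 1.5), crossing the usual 1.96 threshold, which is one reason to hold statistical conclusions until the aggregated report.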