Author Topic: AV-Comparatives Test February/March 2009

onlysomeone

  • Guest
AV-Comparatives Test February/March 2009
« on: March 22, 2009, 08:37:41 PM »
New test-results from AV-Comparatives are available!

Avast! took 5th place in detection, with a detection rate of 98.2%.
Because of 28 false positives, Avast! could only reach the "Advanced" level...
In on-demand scanning speed, Avast! took 4th place with an average of 15.4 MB/second.

I'm proud to say, well done Alwil, and thank you for this great product!

yours
onlysomeone

Offline Lisandro

  • Avast team
  • Certainly Bot
  • *
  • Posts: 67194
Re: AV-Comparatives Test February/March 2009
« Reply #1 on: March 22, 2009, 08:49:27 PM »
At least the on-demand scanning speed has improved ;)
The best things in life are free.

Offline DavidR

  • Avast Überevangelist
  • Certainly Bot
  • *****
  • Posts: 89053
  • No support PMs thanks
Re: AV-Comparatives Test February/March 2009
« Reply #2 on: March 22, 2009, 09:16:38 PM »
Well, this is the introduction of the new policy on false positives, which effectively dropped avast from Advanced+ to Advanced despite a 98.2% detection rate - and all that without heuristics :P

However, the generic signature win32:trojan-gen {Other} was responsible for most of the FPs, so there is a price to pay for generic/algorithmic detections, in much the same way as other AVs get FPs from heuristics.
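
A minimal sketch of why generic detections carry that FP risk (the pattern and helper below are hypothetical, not Avast's actual Win32:Trojan-gen logic): a generic signature is essentially a byte pattern shared by a malware family, so any clean file that happens to contain the same bytes gets flagged too.

Code:
# Hypothetical illustration only -- not Avast's engine or signature format.
GENERIC_SIGNATURE = bytes.fromhex("8b4508508b4d0c51")  # made-up family pattern

def scan(data: bytes) -> bool:
    """Flag the file if the generic pattern occurs anywhere in it."""
    return GENERIC_SIGNATURE in data

family_sample = b"\x00" * 64 + GENERIC_SIGNATURE + b"\x90" * 16
unlucky_clean = b"MZ" + b"\x00" * 32 + GENERIC_SIGNATURE  # benign, same bytes

print(scan(family_sample))  # True -> correct detection
print(scan(unlucky_clean))  # True -> false positive on a clean file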

Hopefully with the introduction of the behaviour module in avast 5 the FPs can be further reduced.
Windows 10 Home 64bit/ Acer Aspire F15/ Intel Core i5 7200U 2.5GHz, 8GB DDR4 memory, 256GB SSD, 1TB HDD/ avast! free 24.3.6108 (build 24.3.8975.762) UI 1.0.801/ Firefox, uBlock Origin, uMatrix/ MailWasher Pro/ Avast! Mobile Security

Offline Omid Farhang

  • Frontend Developer
  • Avast Evangelist
  • Super Poster
  • ***
  • Posts: 1660
  • I wish I could write longer personal text!!
    • Homepage
Re: AV-Comparatives Test February/March 2009
« Reply #3 on: March 22, 2009, 09:45:28 PM »
Ahhh... Symantec is working so hard, and the same goes for McAfee... I know avast! users will stay loyal, but new users may get a bad impression of avast, and... you know! I hope avast! brings something good and new... I don't want to see avast! fail!! I don't know how else I can help the Alwil team as a simple member, other than recommending avast! to my friends and buying avast! Professional; personally I'm using avast! on my laptop and my U3 flash drive :)

I'm waiting for the new things the Alwil team will bring! ;)
Twitter: OmidFarhangEn - OS: Manjaro KDE

kubecj

  • Guest
Re: AV-Comparatives Test February/March 2009
« Reply #4 on: March 22, 2009, 09:51:55 PM »
We did better in both detection and in lowering the number of false positives.

The penalties for FPs are unfortunate, because the quality of AVC's clean set is not good enough to base any kind of conclusions on. I don't underestimate false positives - they're bad for us and for users - I'm just saying they are hard to measure. This result basically says that on the limited AVC clean set we flagged this number of files; it means nothing else. I could easily prepare a clean set on which all other products would fail miserably.

IBK

  • Guest
Re: AV-Comparatives Test February/March 2009
« Reply #5 on: March 22, 2009, 10:09:27 PM »
"This basically says that on the limited AVC cleanset we caught this number of files, it means nothing else." exactly. on a independent set of clean files you had more FPs than compared to other vendors in this test.

Offline Tarq57

  • Avast Evangelist
  • Massive Poster
  • ***
  • Posts: 3695
  • If at first you don’t succeed; call it version 1.0
Re: AV-Comparatives Test February/March 2009
« Reply #6 on: March 22, 2009, 10:19:31 PM »
To tell the truth, I'd far prefer to have the good detection rate and deal with the occasional FP than have to tackle the far greater inconvenience of an infected computer.
So far, using Avast, the latter hasn't happened, so I'm really happy.
Windows 10, Windows Firewall, Firefox w/Adblock.

Offline igor

  • Avast team
  • Serious Graphoman
  • *
  • Posts: 11849
    • AVAST Software
Re: AV-Comparatives Test February/March 2009
« Reply #7 on: March 22, 2009, 10:22:58 PM »
"This basically says that on the limited AVC cleanset we caught this number of files, it means nothing else." exactly. on a independent set of clean files you had more FPs than compared to other vendors in this test.

On the other hand, your other test set (the infected one) is independent as well (I hope?) - so I don't see a good reason not to move the files identified in there as false positives into the other set. (Unless... somebody would be unhappy about the results?)

normishmael

  • Guest
Re: AV-Comparatives Test February/March 2009
« Reply #8 on: March 22, 2009, 10:29:09 PM »
kubecj said:
"We were better in both detections and lowered numbers of falses."

That's right. The test looks good to me.
Somewhat off topic, I know, but it may be time to prepare for yet another post-AVG 8 flood of new Avast! users.
The new Avira 9 final seems early-beta buggy, and while they will no doubt fix a lot of it, their forums seem pretty well filled with stressed and disappointed users.

ilker

  • Guest
Re: AV-Comparatives Test February/March 2009
« Reply #9 on: March 22, 2009, 10:30:01 PM »
avast! is faster than AntiVir according to the test :D I'm very happy with the result. Good work, Alwil Team 8)

IBK

  • Guest
Re: AV-Comparatives Test February/March 2009
« Reply #10 on: March 22, 2009, 10:35:57 PM »
We already discussed this. The clean set contains files whose origin we know (our servers, customers' data, media disks, etc.) and which are reasonably "widespread", so we do not accept submissions of FPs into the clean set. I know vendors have thousands of FPs from other vendors, and instead of sharing those FPs with them so they get fixed, some vendors (not talking about you) may try to poison the sets so that other vendors also get no award if they do not get one.
I can understand that Avast is unhappy about not getting A+ this time, but users who do not care about FPs or awards can always go by the detection results alone if they want to. But we are not going to change basic rules just because you are unhappy; after all, some other testers do not give you an award if you have even a single FP.
Avast showed improved scanning speed and higher detection rates in this test than in last August's test (SET B). False alarms were unfortunately too high this time for the harder-to-achieve A+, but we expect that Avast will have fewer FPs next time (Avast fixed the ones we sent them), and as far as I heard and understand, they now use some automated processes to better avoid clean files being detected.
« Last Edit: March 22, 2009, 10:47:22 PM by IBK »

Offline igor

  • Avast team
  • Serious Graphoman
  • *
  • Posts: 11849
    • AVAST Software
Re: AV-Comparatives Test February/March 2009
« Reply #11 on: March 22, 2009, 10:47:09 PM »
some vendors (not talking about you) may try to poison the sets so that other vendors also get no award if they do not get one.

Honestly, even if that were the case, it's not an excuse - no matter how the false positives got there, they affect the users. Vendors should not blindly add everything they receive; they should do some checking of their own.
If somebody starts to behave nastily and sends other vendors samples of important system libraries, drivers, whatever [as infected files]... I guess users with corrupted operating systems wouldn't be satisfied with the answer "some other vendor poisoned our collection of samples".

I can understand that Avast is unhappy about not getting A+ this time

That's not what bothers me (personally), really. What I don't like is that other vendors, with hundreds of false positives identified on your set, are listed as "few FPs" in the result.

kubecj

  • Guest
Re: AV-Comparatives Test February/March 2009
« Reply #12 on: March 22, 2009, 11:07:02 PM »
"This basically says that on the limited AVC cleanset we caught this number of files, it means nothing else." exactly. on a independent set of clean files you had more FPs than compared to other vendors in this test.

That's true, and I've never denied it.
My problem is that you are assuming something (the penalties) which is, from my point of view, incorrect given the state and size of the clean set. I showed you that this assumption is wrong and that there are lots of _clean_ and _verifiable_ files in your malware set flagged by many products, but you refused to use that information in any way.

Mr.Agent

  • Guest
Re: AV-Comparatives Test February/March 2009
« Reply #13 on: March 22, 2009, 11:11:35 PM »
98.2%? Seriously, wow, nice, I can't believe it! Nice job, ALWIL!

You really surprised me! Keep up the great work, ALWIL!

Oh yeah, I missed something: thanks for this great product; I'm sure the Pro version rocks even more! :D

Mr.Agent
« Last Edit: March 22, 2009, 11:33:57 PM by Mr.Agent »

Offline Lisandro

  • Avast team
  • Certainly Bot
  • *
  • Posts: 67194
Re: AV-Comparatives Test February/March 2009
« Reply #14 on: March 23, 2009, 12:02:17 AM »
We did better in both detection and in lowering the number of false positives.
Yeah... I've received the following IM from a technician:

Quote
avast's detection rate also improved, at least if you compare the percentages from last August's test with the February one (note: you must look at SET B only) - it went from 97.3% (August) to 98.2% now (February)....
The best things in life are free.