Author Topic: AV-Comparatives File Detection Test September 2014 - avast scores worst!?  (Read 2317 times)


REDACTED

  • Guest
Hello,

Although avast has, afaik, never been one of the top performers in the AV-Comparatives file detection tests, I was really surprised and disappointed by the latest results from September 2014:
http://www.av-comparatives.org/wp-content/uploads/2014/10/avc_fdt_201409_en.pdf

Although avast scored an acceptable 98.6% (worse than AVIRA, but better than AVG), it only received the lowest rating, "TESTED", due to a totally unacceptable 120 false positives! That's more than three times the second-worst result in the test roundup (36 false positives). Really bad. How could that happen?

Best regards

Stefan

Offline Pondus

  • Probably Bot
  • ****
  • Posts: 37534
  • Not a avast user
If you had searched the forum, you would have found an almost exact copy of this topic already posted    ;)
It is usually posted the day after these tests are done.......


Offline RejZoR

  • Polymorphic Sheep
  • Serious Graphoman
  • *****
  • Posts: 9406
  • We are supersheep, resistance is futile!
    • RejZoR's Flock of Sheep
I frankly don't care about the false positives they mention, because the way they apparently test for them is utterly stupid. In the Real-World test, avast had only 1 false positive in the entire year 2014, while in the file detection test it had 100+ in just the last 2 tests. How can it be the best and the worst at the exact same thing? It makes zero sense. That's like saying a Volvo is the safest and the least safe car at the same time. Make up your mind is what I'd say to AV-C... And it doesn't matter what testing methodology they used, the results can't be contradictory for the very same thing.
Visit my webpage Angry Sheep Blog

REDACTED

  • Guest
Quote
  If you had searched the forum, you would have found an almost exact copy of this topic already posted    ;)
  It is usually posted the day after these tests are done.......
I had searched for this topic, but only found one regarding the Real-World AV-C results - maybe I used the wrong search term...?

Quote
  I frankly don't care about the false positives they mention, because the way they apparently test for them is utterly stupid. In the Real-World test, avast had only 1 false positive in the entire year 2014, while in the file detection test it had 100+ in just the last 2 tests. How can it be the best and the worst at the exact same thing? It makes zero sense. That's like saying a Volvo is the safest and the least safe car at the same time. Make up your mind is what I'd say to AV-C... And it doesn't matter what testing methodology they used, the results can't be contradictory for the very same thing.
I'm not that deep into AV-C's test criteria, but the question is still valid: why were there 120 false positives where there should have been none? How important they are for real-world results can be discussed, of course, but shouldn't it be investigated what led to these false positives in order to avoid them in future versions?

Offline RejZoR

  • Polymorphic Sheep
  • Serious Graphoman
  • *****
  • Posts: 9406
  • We are supersheep, resistance is futile!
    • RejZoR's Flock of Sheep
I was talking about the actual Real-World test performed by AV-C. They actually call it that...

Offline Pondus

  • Probably Bot
  • ****
  • Posts: 37534
  • Not a avast user
Quote
  I had searched for this topic, but only found one regarding the Real-World AV-C results - maybe I used the wrong search term...?
Here is one   https://forum.avast.com/index.php?topic=156946.msg1135241#msg1135241