Author Topic: New Virus.gr Tests! "2-16 April 2005"  (Read 21763 times)


Offline igor

  • Avast team
  • Serious Graphoman
  • Posts: 11849
    • AVAST Software
Re: New Virus.gr Tests! "2-16 April 2005"
« Reply #30 on: April 27, 2005, 09:40:43 AM »
Remember: Good or bad, there are 91202 samples... 8)

No, they are not - and that's the point Pavel (and I) was trying to explain. The samples are badly chosen, and many of them are probably not viruses at all. Why do you expect antiviruses to detect them?

Spyros

  • Guest
Re: New Virus.gr Tests! "2-16 April 2005"
« Reply #31 on: April 27, 2005, 10:09:57 AM »
Since Avast has no heuristics (or other effective proactive protections) as ... Dr.Web

Yeah, great heuristics. I scanned my system yesterday with CureIt! by Dr.Web. No real virus detected, 3 false positives on 100% legal commercial products (by Ashampoo & others)... No, thanks.

Komm

  • Guest
Re: New Virus.gr Tests! "2-16 April 2005"
« Reply #32 on: April 27, 2005, 10:44:11 AM »
Hi Igor!

I want to understand why the samples are badly chosen.  The samples were collected from 8 companies, not only one.  Statistically, the top antiviruses, and even the mid-range ones, have a small percentage of false positives.  This makes me believe that the effect of false positives is reduced in the total sum.  All antiviruses have false positives, Avast too.  And a file that is probably not a virus could be detected by Avast as well.

Good AND bad, there are 91202 samples, and we can all learn something from the Virus.gr test. Avast's detection rate could have been higher if samples from Alwil had been used, but they weren't.  Patience.  ;)
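To put rough numbers on what I mean, here is a little Python sketch (everything except the 91202 total is invented for illustration):

# Rough illustration of my point: if each of the 8 source collections
# contains only a small share of dubious (non-virus) files, the pooled
# share across the whole test bed stays small too.
# All numbers except the 91202 total are invented.
sizes  = [11_400] * 8                      # assume roughly equal contributions
shares = [0.01, 0.02, 0.015, 0.03, 0.01, 0.025, 0.02, 0.015]  # dubious fractions

total   = sum(sizes)                                   # ~91202 samples
dubious = sum(n * s for n, s in zip(sizes, shares))
print(f"pooled dubious share: {dubious / total:.1%} of {total} samples")
# -> about 1.8%, i.e. the averaging keeps the false-positive effect small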
« Last Edit: April 27, 2005, 10:52:13 AM by Komm »

Pavel Baudis

  • Guest
Re: New Virus.gr Tests! "2-16 April 2005"
« Reply #33 on: April 27, 2005, 11:14:05 AM »
Hi Igor!

I want to understand why the samples are badly chosen.  The samples were collected from 8 companies, not only one.  Statistically, the top antiviruses, and even the mid-range ones, have a small percentage of false positives.  This makes me believe that the effect of false positives is reduced in the total sum.  All antiviruses have false positives, Avast too.  And a file that is probably not a virus could be detected by Avast as well.

Good AND bad, there are 91202 samples, and we can all learn something from the Virus.gr test. Avast's detection rate could have been higher if samples from Alwil had been used, but they weren't.  Patience.  ;)

I am not speaking about the samples - I haven't seen them, so I am not able to say anything about them. What I was trying to explain is that the whole concept is wrong. In really serious tests, the tester MUST replicate the viruses himself - and verify that the samples are able to replicate further. And this is a really painful and difficult task. But then he knows exactly what is tested, and the comparison has some value. I agree that among those samples there are a lot of working viruses. Other files could be damaged, false alarms etc., and NOBODY is able to say how valid the test bed is. So the value of such tests is *VERY* low...
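To make the idea concrete, here is a small Python sketch. All the entries below are made up; in a real test, "replicates" means you ran the sample on an isolated machine and confirmed that a second generation of infections works:

# Sketch: compute the detection rate only over samples that were verified
# to replicate. All entries are invented for illustration.
samples = [
    {"name": "virus_a",    "replicates": True,  "detected": True},
    {"name": "virus_b",    "replicates": True,  "detected": True},
    {"name": "corrupt_c",  "replicates": False, "detected": False},  # damaged file
    {"name": "intended_d", "replicates": False, "detected": False},  # non-working "virus"
]

verified = [s for s in samples if s["replicates"]]   # the only valid test bed

naive    = sum(s["detected"] for s in samples)  / len(samples)
rigorous = sum(s["detected"] for s in verified) / len(verified)
print(f"rate over everything:       {naive:.0%}")     # 50% - junk drags it down
print(f"rate over verified viruses: {rigorous:.0%}")  # 100% - the scanner was fine

With an unverified test bed, nobody can say whether a low score means a weak scanner or a pile of junk samples.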

Pavel

TAP

  • Guest
Re: New Virus.gr Tests! "2-16 April 2005"
« Reply #34 on: April 27, 2005, 11:54:19 AM »
Since Avast has no heuristics (or other effective proactive protections) as ... Dr.Web

Yeah, great heuristics. I scanned my system yesterday with CureIt! by Dr.Web. No real virus detected, 3 false positives on 100% legal commercial products (by Ashampoo & others)... No, thanks.

Yes, Dr.Web and NOD32 may well be great at heuristics, but they produce false positives too; no surprise there. The fact is that every AV (including Avast) has false positives.

Just yesterday I submitted a file that is very likely a false positive, which Avast flags as Win32:Trojan-gen {other}.  :)

Komm

  • Guest
Re: New Virus.gr Tests! "2-16 April 2005"
« Reply #35 on: April 28, 2005, 12:14:47 AM »
I am not speaking about the samples - I haven't seen them, so I am not able to say anything about them. What I was trying to explain is that the whole concept is wrong. In really serious tests, the tester MUST replicate the viruses himself - and verify that the samples are able to replicate further. And this is a really painful and difficult task. But then he knows exactly what is tested, and the comparison has some value. I agree that among those samples there are a lot of working viruses. Other files could be damaged, false alarms etc., and NOBODY is able to say how valid the test bed is. So the value of such tests is *VERY* low...

Pavel

Hi Pavel!

I understand your viewpoint.  A virus, by definition, has the capability of replication.  But done the way you propose, it would be nearly impossible to carry out a virus test: you would have to work on an infected machine and let the viruses actually infect files, and their procedure wasn't that.  I agree, it was a simplified test.  :P

But even though they didn't do a complete test, we can learn lessons from their work.  They tested whether a machine detects a virus on first contact, without executing the virus sample (that's what I think).  That is the best way to fight a virus nowadays: detect it when it arrives and you will be free of trouble.  Otherwise, an infected machine would need disinfection and removal tests, and that would be time-consuming.  So I ask: why run extended tests if an antivirus already failed the first one? Time is money; I forgive them...  8)

I expect to participate soon in Avast's improvement, sending bug reports and other observations.  My 22 years of working with computers must be good for something, after all...
See you later, Alwil Team.  ;D
« Last Edit: April 28, 2005, 07:51:35 AM by Komm »

tls66

  • Guest
Re: New Virus.gr Tests! "2-16 April 2005"
« Reply #36 on: May 05, 2005, 11:27:51 PM »
Perhaps I'm flogging a dead horse here, but it was mentioned that nobody has heard of some of the top 20 antivirus products. To help demystify some of these products, and to cast even more doubt on the test: Arcavir is a Kaspersky clone, in engine anyway (not sure about the definitions). And Command Antivirus by Authentium is a rebadged F-Prot clone that even uses the same definitions, yet scored over 3% lower. The same goes for all the BitDefender clones as well; some scored below the others when the results should be close. The biggest mystery is Norton AV 2005 vs. Symantec CE: same definitions, different results? Just something to chew on before taking Anthony Peltonis's word for it!  :-\
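Here's the sanity check I'm doing in my head, written out as a quick Python sketch (the scores are invented; only the clone pairings come from the products above):

# Sanity check on engine clones: products sharing an engine and definitions
# should score within a hair of each other. The scores below are invented;
# only the pairings reflect the clones mentioned above.
scores = {                         # invented detection percentages
    "F-Prot": 85.0, "Command AV": 81.7,
    "Norton AV 2005": 88.0, "Symantec CE": 84.5,
}
clone_pairs = [("F-Prot", "Command AV"), ("Norton AV 2005", "Symantec CE")]

for a, b in clone_pairs:
    gap = abs(scores[a] - scores[b])
    if gap > 1.0:                  # same definitions should score nearly the same
        print(f"suspicious: {a} vs {b} differ by {gap:.1f} points")

When clone pairs diverge by several points, that says more about the test bed than about the scanners.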