New results from AV-Test.org (Q1/2008)

Discussion in 'other anti-virus software' started by Valentin_Pletzer, Jan 22, 2008.

Thread Status:
Not open for further replies.
  1. Valentin_Pletzer

    Valentin_Pletzer Registered Member

    Joined:
    Jun 19, 2007
    Posts:
    11
    Hi guys!

    I just wanted to let you know that Andreas Marx was kind enough to provide me with his newest test results. He is currently in Bilbao, Spain, at the Anti-Malware Task Force Meeting.

    I published the results in my blog (in German): http://blog.chip.de/0-security-blog/security-suiten-2008-im-test-q12008-20080122/

    If you have any questions, please feel free to leave a comment below the blog entry.

    Greetings from Munich
    Valentin
     
  2. jrmhng

    jrmhng Registered Member

    Joined:
    Nov 4, 2007
    Posts:
    1,268
    Location:
    Australia
    Interesting. Not too many surprises there. A few things I noted, though:

    1) Why does Command do worse than F-Prot when they are using the same engine?
    2) Clam is improving, especially considering it is only signature-based.
    3) Microsoft is also improving (though other tests have already shown it has improved a fair bit since OneCare v1). It seems to have very strong signature detection but weak heuristics.
    4) ESET is strong on heuristics but not as good at signature scanning (does that surprise anyone? :p)
    5) I just find it sadistically funny that VET is at the top of the false-positive list and at the bottom of the detection-rate list.
     
  3. Blackcat

    Blackcat Registered Member

    Joined:
    Nov 22, 2002
    Posts:
    4,024
    Location:
    Christchurch, UK
    They are not. CSAV still uses the old version 3 engine, so it is equivalent to F-Prot 3. In contrast, FPAV 6 was tested here and, as shown, has a much higher detection rate.
     
  4. Sputnik

    Sputnik Registered Member

    Joined:
    Feb 24, 2005
    Posts:
    1,198
    Location:
    Москва
    Nice, thanks a lot for posting. Personally I'm very pleased to see the performance of avast!; their huge signature additions are paying off. Trend Micro's detection is also on the rise: the best detection of the top 3 brands (Symantec, McAfee, Trend Micro)!
     
  5. trjam

    trjam Registered Member

    Joined:
    Aug 18, 2006
    Posts:
    9,102
    Location:
    North Carolina USA
    English Translation:

    That is a really nice surprise: the very latest virus scanner test results have just arrived in my inbox. The results come directly from Andreas Marx and his test lab AV-Test, and naturally I don't want to withhold them from you. Security suites as of January 7, 2008 were tested under Windows XP SP2 (English). For every product the best available version was used (not, however, the beta). The test categories are as follows:
    - signature-based test with 1 million malware samples from the last 6 months (thus no outdated viruses)
    - false-positive test with 65,000 clean files
    - proactive detection with:
    + 3,500 samples in a retrospective test (the signatures were left one week out of date, to see which new samples were still detected; see the sketch at the end of this post)
    + 20 active samples for the behaviour-based test
    - response times (based on 55 samples from the year 2007)
    - rootkit detection (12 active samples)
    First, the overall rating:
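    (The overall rating table itself is in the blog entry linked in the first post.) Since the retrospective test is the least self-explanatory of these categories, here is a minimal sketch of the idea in Python. The scan callback and the sample metadata are hypothetical; AV-Test's actual harness is not public.

    Code:
    from datetime import datetime, timedelta

    # Idea of a retrospective test: freeze the scanner's signatures at
    # time T, then scan only samples first seen AFTER T, so that any
    # detection must come from heuristics or generic signatures rather
    # than from an exact, up-to-date signature.

    SIGNATURE_FREEZE = datetime(2008, 1, 7)   # last update before testing
    WINDOW = timedelta(weeks=1)               # the "one week old" scanner

    def retrospective_rate(samples, scan):
        """samples: iterable of (path, first_seen); scan: detection callback."""
        in_window = [path for path, first_seen in samples
                     if SIGNATURE_FREEZE < first_seen <= SIGNATURE_FREEZE + WINDOW]
        detected = sum(1 for path in in_window if scan(path))
        return detected / len(in_window) if in_window else 0.0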
     
  6. dawgg

    dawgg Registered Member

    Joined:
    Jun 18, 2006
    Posts:
    818
    I'm a little lost... can someone please explain: if 1 million malware samples were used, why do some AVs detect more than a million?

    I'm surprised Avast did so well and AntiVir had so few FPs.
    Not surprised AntiVir and Kaspersky have amongst the fastest response times.
    Surprised that WebWasher got only 2 FPs.
    Wouldn't have expected AntiVir to get + for proactive detection and F-Secure to get ++.
     
    Last edited: Jan 22, 2008
  7. xandros

    xandros Registered Member

    Joined:
    Oct 30, 2006
    Posts:
    411
    Good job, Avira AntiVir & avast!

    I have read many things about AntiVir on many sites, and it's excellent.
     
  8. Stijnson

    Stijnson Registered Member

    Joined:
    Nov 7, 2007
    Posts:
    533
    Location:
    Paranoia Heaven
    I'm a bit technically challenged, so can someone explain what this means?
     
  9. Steel

    Steel Registered Member

    Joined:
    Jul 21, 2005
    Posts:
    219
    NOD32's results in all categories frighten me. What happened here? :eek:
     
  10. C.S.J

    C.S.J Massive Poster

    Joined:
    Oct 16, 2006
    Posts:
    5,029
    Location:
    this forum is biased!
    Only 2 FPs from Dr.Web?

    Am I reading this right? :eek:
     
  11. Xenophobe

    Xenophobe Registered Member

    Joined:
    May 26, 2007
    Posts:
    174
    ESET did a poor job of detecting threats with signatures (which are issued in daily updates) and a good job with heuristics, which is a method of detecting possible new threats that have no signature yet.
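    A toy contrast between the two methods, for anyone curious; this is not a real engine, and the byte patterns and traits below are made up:

    Code:
    SIGNATURES = {b"\xde\xad\xbe\xef"}  # made-up exact byte patterns

    def signature_scan(data: bytes) -> bool:
        # Matches known malware byte-for-byte; only as good as the
        # latest daily update.
        return any(sig in data for sig in SIGNATURES)

    def heuristic_scan(data: bytes) -> bool:
        # Scores generic suspicious traits instead, so it can flag a
        # sample that has no signature yet (at some false-positive risk).
        score = 0
        score += b"CreateRemoteThread" in data           # injection-style API string
        score += b"UPX!" in data[:0x400]                 # packer magic near the header
        score += data[:2] == b"MZ" and len(data) < 4096  # suspiciously tiny executable
        return score >= 2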
     
  12. Stijnson

    Stijnson Registered Member

    Joined:
    Nov 7, 2007
    Posts:
    533
    Location:
    Paranoia Heaven
    Hmmm, okay. Thanks, Xenophobe.

    I also see Symantec in the list. Does anyone know which version was tested (and where can I find this)? Is it the same as the Corporate version?
     
  13. solcroft

    solcroft Registered Member

    Joined:
    Jun 1, 2006
    Posts:
    1,639
    Not good enough to help its overall detection score, unfortunately. Do you mean to say the testers turned off ESET's heuristics for this test?
     
  14. Paul Wilders

    Paul Wilders Administrator

    Joined:
    Jul 1, 2001
    Posts:
    12,475
    Location:
    The Netherlands
    Some vital information is missing as far as I'm concerned: no info about the testbed used for the signature test, for example. Is plain adware included? Smart people can come up with more questions like that, I'm sure ;)

    All in all, I'd personally like to see far more info about the test conditions before jumping to conclusions.

    That said: even though the needed info is lacking for the moment, congrats to the ones who scored very well.

    regards,

    paul
     
  15. Paul Wilders

    Paul Wilders Administrator

    Joined:
    Jul 1, 2001
    Posts:
    12,475
    Location:
    The Netherlands
    ...and there's the first smart question ;) Was testing done out-of-the-box, or after tweaking?

    Keep those questions coming, ladies and gents! ;)
     
  16. Stijnson

    Stijnson Registered Member

    Joined:
    Nov 7, 2007
    Posts:
    533
    Location:
    Paranoia Heaven
    What I find a bit strange is that NOD32 always scores lower in AV-Test.org's tests than in AV-Comparatives'...
    I guess it's also a matter of how things are tested. I do hope these AV-Test results will be expanded with version numbers of the tested products, though; those seem to be missing.
     
  17. Dieselman

    Dieselman Registered Member

    Joined:
    Jan 6, 2008
    Posts:
    795
    Doesn't make me feel good about spending $40 on NOD32. I should have kept Avast for free. :'(
     
  18. Paul Wilders

    Paul Wilders Administrator

    Joined:
    Jul 1, 2001
    Posts:
    12,475
    Location:
    The Netherlands
    Bolded part: Bingo! Plus: what sort of samples were tested?

    regards,

    Paul
     
  19. Paul Wilders

    Paul Wilders Administrator

    Joined:
    Jul 1, 2001
    Posts:
    12,475
    Location:
    The Netherlands
    One should not jump to conclusions without knowing all the facts. And this goes not just for NOD32 in particular, but for all the antiviruses tested ;) .

    regards,

    paul
     
  20. aigle

    aigle Registered Member

    Joined:
    Dec 14, 2005
    Posts:
    11,167
    Location:
    UK / Pakistan
    Overall detection of NOD32 is not good, though it has very good heuristics.
    They must add a lot of signatures, like Avira and others do.
     
  21. ASpace

    ASpace Guest


    But is there anyone here who can answer such questions, Paul? ;)
     
  22. Valentin_Pletzer

    Valentin_Pletzer Registered Member

    Joined:
    Jun 19, 2007
    Posts:
    11
    Hi Paul,

    To make things easier, here is the original e-mail from Andreas:

    All products (in the "best" available Security Suite edition) were last updated on January 7, 2008 and tested on Windows XP SP2 (English).

    First, we checked the signature-based on-demand detection of all products against more than 1 million samples we've found spreading, or which were distributed, during the last six months (this means we have not used any "historic" samples). We included all malware categories in the test: Trojan horses, backdoors, bots, worms and viruses. Instead of just presenting the results, we have ranked the products this time, from "very good" (++) if the scanner detected more than 98% of the samples to "poor" (--) when less than 85% of the malware was detected.

    Secondly, we checked the number of false positives the products generated during a scan of 65,000 known clean files. Only products with no false positives received a "very good" (++) rating.

    For the proactive detection category, we have not focussed on signature- and heuristic-based proactive detection only (based on a retrospective test approach with a one-week-old scanner).
    Instead, we also checked the quality of the included behaviour-based guard (e.g. DeepGuard in the case of F-Secure and TruPrevent in the case of Panda). We used 3,500 samples for the retrospective test, as well as 20 active samples for the test of the "dynamic detection" (and blocking) of malware.

    Furthermore, we checked how long AV companies usually need to react to new, widespread malware (read: outbreaks), based on 55 different samples from the entire year 2007. "Very good" (++) AV product developers should be able to react within less than two hours.

    Another interesting test was the detection of active rootkit samples.
    While it's trivial for a scanner to detect inactive rootkits using a signature, it can be really tricky to detect this nasty malware when it is active and hidden. We checked the scanners' detection against 12 active rootkits.


    regards
    Valentin
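
    P.S. To make the rating bands concrete, here is a minimal sketch. Note that the mail only states the "++" and "--" cut-offs for detection and the "++" conditions for false positives and response time; the intermediate bands below are my own assumption, added purely for illustration.

    Code:
    def detection_rating(detected: int, total: int) -> str:
        pct = 100.0 * detected / total
        if pct > 98.0:
            return "++"        # stated: more than 98% detected
        if pct >= 95.0:
            return "+"         # assumed band
        if pct >= 90.0:
            return "o"         # assumed band
        if pct >= 85.0:
            return "-"         # assumed band
        return "--"            # stated: less than 85% detected

    def fp_rating(false_positives: int) -> str:
        # Stated: only zero false positives earns "++"; lower bands not given.
        return "++" if false_positives == 0 else "below ++ (bands not stated)"

    def response_rating(hours: float) -> str:
        # Stated: reacting in under two hours earns "++"; lower bands not given.
        return "++" if hours < 2.0 else "below ++ (bands not stated)"

    print(detection_rating(985_000, 1_000_000))  # -> ++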
     
  23. C.S.J

    C.S.J Massive Poster

    Joined:
    Oct 16, 2006
    Posts:
    5,029
    Location:
    this forum is biased!
    These massive tests are interesting at best.

    I find over 1 million new threats in the last 6 months extremely hard to believe.

    Just how many of these are real threats that are actually circulating?

    So I wouldn't worry, Paul, about your beloved NOD32 (especially not based on these huge tests, anyway).

    ;)
     
  24. MalwareDie

    MalwareDie Registered Member

    Joined:
    Dec 8, 2006
    Posts:
    500
    65,000 is quite a small number compared to AV-Comparatives' number of at least 10 million.
     
  25. Brian N

    Brian N Registered Member

    Joined:
    Jul 7, 2005
    Posts:
    2,174
    Location:
    Denmark
    I've never seen 10 million in a test at AV-Comparatives, but whatever; AntiVir is kicking ass.
     