Many big-brand security products fail to spot commonly-circulating malware, testing outfit Virus Bulletin has found in its latest tests.
A total of 17 of the 32 anti-virus products tested failed the company's stringent VB100 test, which expects software to detect 100 percent of the commonly-circulating 'WildList' malware thrown at it without signalling any false positives.
Programs failing included those from Sophos, Kaspersky, Fortinet, Trend Micro, CA Home, and PC Tools, though within this group detection failures varied widely. CA's Home program scored a disturbingly high 40 misses, while the others scored from 8 misses down to only one miss for Kaspersky. PC Tools' Spyware Doctor detected the WildList suite but failed because it falsely identified two files as malware.
The worst performer on test was the relative unknown, Kingsoft AntiVirus, which missed large numbers of malware types, including 120 examples from the WildList, and over 80 percent of the worms and bots it was tested against.
"It was a shock and a concern to see such a poor performance from so many products in this latest round of testing," said John Hawes of Virus Bulletin.
"It is particularly disappointing to see so many major products missing significant real-world threats. In these days of hourly updates computer users really ought to be able to rely on their chosen security vendors for full protection against known threats."
The tests were run on Windows 2000 using a variety of worms, viruses, bots, and polymorphic malware, though the company said it rated the issues as being independent of platform. A program failing to spot a particular piece of malware on one platform would be unlikely to spot it running on another, such as XP, because the detection system would be the same.
"Once the products are up and running, the detection engines should operate in much the same way on all systems - we use the default settings applied by the products," said another company source.
"The main problem here was with some particularly tricky polymorphic viruses listed as 'In the Wild' by the WildList organisation, with many products detecting some but not all files infected by the malware. There were also several clean files wrongly labelled as malware. Both these problems will have been repeated on XP, Vista, and probably other platforms too."
Not everyone agrees that the WildList, used by the VB100 tests, is a representative sample of real-world malware. The list excludes certain types of malware such as Trojans and backdoor rootkits. Moves are afoot to come up with a consistent set of tests for such malware based on behavioural characteristics rather than specific signatures.
Source: http://www.spaminspector.org/Antivirus/