International Antivirus Testing Conference. Viorel Canja, Head of BitDefender Labs, Bitdefender.
Exploiting the testing system
Viorel Canja, Head of BitDefender Labs
Contents
- What does the title mean?
- Testing detection on wildcore
- Testing detection on zoo collections
- Retrospective detection tests
- Examples
- Feedback from the industry
- Q&A
What does the title mean?
Purpose of tests:
- to define metrics and measure the performance of AV products
- to find an approximation of the real-world performance of AV products
- to give feedback to AV researchers about their products
- to allow users to make an informed decision
What does the title mean?
“Define: exploit”
- use or manipulate to one’s advantage
- draw from; make good use of
- overwork: work excessively hard
What does the title mean?
To use the limitations of the testing procedure to one’s advantage.
The focus is on those actions which have questionable benefits for the user.
Types of tests
- detection tests on wildcore
- detection tests on zoo collections
- retrospective detection tests
Testing detection on wildcore
• What is wildcore?
“WildCore is a set of replicated virus samples that represents the real threat to computer users.”
“When a virus is reported to us by two or more Reporters, it's a pretty good indication that the virus is out there, spreading, causing real problems to users. We consider such a virus to be 'In the Wild'.”
Testing detection on wildcore
The Wildcore samples are known to all AV companies as soon as wildcore is published.
Tests are therefore likely to be performed on exactly the same samples. This is always the case for samples of malware that does not replicate.
Testing detection on wildcore
Quick hack: just sign all the samples with dumb (aka automatic) signatures.
Disable heuristics to avoid false positives (if the testbed is already known, there is no need for technology that detects previously unknown threats).
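A dumb signature in this sense is nothing more than an exact fingerprint of a known file. A minimal sketch (hypothetical code; the function and database names are illustrative, not any vendor's actual engine) shows why it is so cheap: hashing each published sample yields 100% "detection" of the test set while recognizing nothing outside it.

```python
import hashlib

def dumb_signature(sample_bytes: bytes) -> str:
    # Whole-file hash: matches one exact file and nothing else.
    # No analysis of what the code actually does is performed.
    return hashlib.md5(sample_bytes).hexdigest()

def build_signature_db(samples: dict) -> dict:
    # "Sign" every sample in the published test set automatically.
    return {dumb_signature(data): name for name, data in samples.items()}

def scan(sample_bytes: bytes, sig_db: dict):
    # Returns the detection name, or None ("clean").
    return sig_db.get(dumb_signature(sample_bytes))
```

Any byte changed in a sample breaks the match, which is exactly why this scores perfectly on a fixed testbed and poorly in the real world.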
Testing detection on zoo collections
- The zoo should contain a large number of files so that the statistics are as accurate as possible
- Threats should be replicated (where applicable), or large numbers of samples should be used for polymorphic malware or malware that is re-generated on the server
- The zoo should not contain garbage
Testing detection on zoo collections
Hacks:
- use customized settings for the test: heuristics should be set to paranoid mode
- automatically sign all previously missed samples and white-list all previously reported false positives
- automatically sign all samples detected by at least one AV product, just to be on the “safe” side
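The sign-the-misses / white-list-the-false-positives loop can be sketched as follows (hypothetical helper names; `fingerprint` stands in for a dumb, exact-match signature). After each round, every missed sample is signed and every reported false positive is white-listed, which improves the next score on the same collection without improving real-world detection at all.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    # Exact-match hash standing in for a "dumb" signature.
    return hashlib.md5(data).hexdigest()

def tune_for_retest(sig_db: dict, fp_whitelist: set,
                    missed_samples: dict, reported_fps: list):
    """Patch the product for the next round on the same collection:
    sign every missed sample, white-list every reported false positive.
    Hypothetical sketch of the hack described above."""
    for name, data in missed_samples.items():
        sig_db[fingerprint(data)] = name        # now "detected"
    for data in reported_fps:
        fp_whitelist.add(fingerprint(data))     # now "clean"
    return sig_db, fp_whitelist
```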
Testing detection on zoo collections
Hacks (2):
- add detection routines for garbage that is usually found in collections: detecting known false positives of other products, detecting damaged executables, and detecting files produced by different analysis tools
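As one illustration of "detecting" garbage, a crude check for damaged Windows executables might look like the sketch below. It assumes (hypothetically) that corrupted zoo samples often keep the 'MZ' DOS header but lose the 'PE\0\0' signature whose offset is stored at 0x3C; it is not a robust PE parser.

```python
def looks_like_damaged_pe(data: bytes) -> bool:
    """Crude damaged-executable check (illustrative only)."""
    if len(data) < 0x40 or data[:2] != b"MZ":
        return False  # not claiming to be a PE file at all
    # e_lfanew: offset of the PE signature, stored at 0x3C.
    pe_off = int.from_bytes(data[0x3C:0x40], "little")
    # Claims to be a PE but the signature is missing/truncated: "garbage".
    return data[pe_off:pe_off + 4] != b"PE\x00\x00"
```

Flagging such files as "detected" pads a score on dirty collections while meaning nothing for real protection.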
Retrospective detection tests
Signature databases are frozen at a certain moment. Detection is tested against samples received after that moment.
Testing should be done with default settings, because most products are marketed as “install and forget” and the majority of users will not change the settings.
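The procedure above can be sketched as a tiny harness (hypothetical; `scan` stands for the product with its database frozen at `freeze_date`, run with default settings, returning True when a sample is detected):

```python
from datetime import date

def retrospective_detection_rate(scan, samples, freeze_date):
    """Retrospective test: run the frozen engine only against samples
    first seen strictly after freeze_date; report the detection rate.
    `samples` is a list of (first_seen_date, sample_bytes) pairs."""
    later = [data for seen, data in samples if seen > freeze_date]
    if not later:
        return 0.0
    detected = sum(1 for data in later if scan(data))
    return detected / len(later)
```

Anything the engine catches here must come from heuristics or generic routines, since no signature for these samples can exist in the frozen database.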
Retrospective detection tests
This approach has the disadvantage that it will not take into account proactive detection introduced by generic routines created for malware families that appear after the signatures are frozen.
Such routines (or signatures) will proactively detect subsequent variants of the same family.
It also favors aggressive heuristics if the results are not correlated with false-positive tests.
Examples
Automatic signing:
- Av01 (1st pair): TR/Zapchast.CP
- Av02: Collected.Z
- Av03: W32/KillAV.3B84!tr
- Av04: Trojan.Downloader.Asks
- Av05: Program:Win32/SpySheriff (threat-c)
- Av06: Trojan.Gen
- Av07: Win32:Trojan-gen. {Other}
- Av08: Win32/Dewnuttin.B
- Av09: W32/Tofger.CD
- Av10: Application/KillApp.A
- Av11 (2nd pair): TROJ_PROCKILL.DJ
- Av12: Trojan.Xtssksastsm
- Av13 (1st pair): Trojan.Win32.Zapchast.cp
- Av14 (2nd pair): application ProcKill-DJ
- Av15: Win32/ProcKill.1hj!Trojan
- Av16: Trojan.Zapchast.CT
Examples
Detecting other products’ false positives:
Av01: Backdoor.X
Av02: FalseAlarm.Av01.Backdoor.X
Feedback from the industry
Automatic sample processing…
- is a must, given the number of samples received
Feedback from the industry
… and adding detection based on the output of other AVs:
- illegal, immoral, plain wrong
- bad idea
- it’s common practice
- it probably started as an attempt to have common names
- there is no other way
Feedback from the industry
Reporting packed files:
- if they are not malicious, we should not detect them
- some of the packers should be blacklisted, while others are too widely used and must be allowed
- an unfortunate necessity
- professional companies do not need to use dodgy packers
Feedback from the industry
White-listing clean apps instead of black-listing malware:
- it’s not possible
- does not scale
- it’s OK in controlled environments
- a better and better idea as time passes
The end …
Q&A