Efficient Detection of Split Personalities in Malware
Davide Balzarotti, Marco Cova, Christoph Karlberger, Christopher Kruegel, Engin Kirda, and Giovanni Vigna
NDSS 2010
OUTLINE • Introduction and Related Work • Our Approach • Implementation • Evaluation • Conclusion
Introduction
• Malware detection
  • Static
  • Dynamic
  • Sandboxes (Anubis, CWSandbox, Joebox, Norman Sandbox)
• Counterattack
  • Attacks on Virtual Machine Emulators
    • CPU semantics, timing attacks
  • Environment attacks
    • Processes, drivers, or registry values
Solution 1: Transparent malware analysis
• Cobra
  • Code blocks
  • Replaces unsafe instructions with a safe version
• Ether
  • Hardware virtualization
• More difficult for malicious code to detect. Great, but slow.
Solution 2: Detect behavioral differences
• “Emulating Emulation-Resistant Malware”, 2009
  • Reference system vs. emulated environment
  • Compares execution paths
  • Uses Ether to produce the reference trace
  • But executing the same program twice can lead to different execution runs
OUTLINE • Introduction and Related Work • Our Approach • Implementation • Evaluation • Conclusion
Our approach
• Recording and replaying
  • Reference system vs. emulated environment
  • System call traces: types and arguments (one possible record layout is sketched below)
• If the observed behavior differs
  • Rerun the sample in a transparent framework (Ether)
• Detect malware reliably and efficiently
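The slides do not show the on-disk format of the log; as a rough illustration, one recorded entry could look like the C struct below. The field names and the fixed-size output snapshot are assumptions for illustration, not the authors' actual layout.

```c
#include <stdint.h>

/* Hypothetical layout of one entry in the system call log.
 * The reference run appends one record per system call; the
 * replay run on the analysis system reads the records back. */
#define MAX_OUT_BYTES 256

typedef struct {
    uint32_t syscall_number;          /* which NT system call was issued    */
    uint32_t thread_id;               /* thread that issued it              */
    uint32_t status;                  /* NTSTATUS returned on the reference */
    uint32_t out_len;                 /* bytes of captured output data      */
    uint8_t  out_data[MAX_OUT_BYTES]; /* output buffers (file reads, etc.)  */
} syscall_record;
```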
Reliability
• Two systems are execution-equivalent if every program that
  • starts from the same initial state, and
  • receives the same inputs on both systems
  => exhibits the same runtime behavior
  => produces the same sequence of system calls
• Assumes no race conditions
Reliability (cont.)
• If our reference system and the analysis system are execution-equivalent, any difference in the observed behavior => split personality
• Moreover, such a discrepancy must be the result of CPU semantics or timing attacks
Making Systems Execution-Equivalent
• Same OS environment
• Same address-space layout of the process at load time
• Same inputs to the program
  • Run the program on the reference system in log mode
  • Run the program on the analysis system in replay mode
• System call matching
Replay Problems
• A number of system calls are not safe to replay
  • e.g., allocating memory, spawning threads
• Only system calls that read data from the environment are replayed
  • All other system calls are passed directly to the underlying OS (see the dispatch sketch below)
• Delays can cause additional system calls
  • e.g., WaitForSingleObject()
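A minimal sketch of the replay-side decision described above: only calls that read data from the environment are answered from the log, everything else is forwarded to the real OS. The system call classification and the helper functions (serve_from_log, call_original_syscall) are assumptions for illustration.

```c
#include <stdbool.h>
#include <stdint.h>

/* Hypothetical system call identifiers and helpers, for illustration only. */
enum { SC_NT_READ_FILE = 1, SC_NT_QUERY_VALUE_KEY, SC_NT_DEVICE_IO_CONTROL };
extern uint32_t serve_from_log(uint32_t sc, void *args);        /* copy logged output + status */
extern uint32_t call_original_syscall(uint32_t sc, void *args); /* forward to the real kernel  */

/* Only calls that read data from the environment are replayed. */
static bool is_replayed(uint32_t sc) {
    switch (sc) {
    case SC_NT_READ_FILE:          /* file contents      */
    case SC_NT_QUERY_VALUE_KEY:    /* registry values    */
    case SC_NT_DEVICE_IO_CONTROL:  /* network/device I/O */
        return true;
    default:
        return false;  /* memory allocation, thread creation, waits, ... pass through */
    }
}

/* Replay handler invoked for every trapped system call. */
uint32_t dispatch_replay(uint32_t sc, void *args) {
    return is_replayed(sc) ? serve_from_log(sc, args)
                           : call_original_syscall(sc, args);
}
```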
System Call Matching
• [Figure: the reference and analysis system call traces are matched against each other; reference calls not (yet) seen in the analysis trace go to Buf_skipped, analysis calls with no counterpart in the reference trace go to Buf_extra]
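The figure suggests that the two traces are matched call by call, with unmatched reference calls parked in Buf_skipped and unmatched analysis calls collected in Buf_extra. The sketch below is one way such a comparison could work on the example traces (3 4 1 2 vs. 1 3 2 5); it is a simplified interpretation, not the authors' exact matching algorithm, and it identifies a call by its number only instead of by type and arguments.

```c
#include <stdio.h>

#define TRACE_LEN 4

static int buf_skipped[TRACE_LEN], n_skipped = 0;  /* reference calls not matched yet    */
static int buf_extra[TRACE_LEN],   n_extra   = 0;  /* analysis calls with no counterpart */

/* Try to match a call against reference calls that were skipped earlier. */
static int take_skipped(int sc) {
    for (int i = 0; i < n_skipped; i++)
        if (buf_skipped[i] == sc) { buf_skipped[i] = buf_skipped[--n_skipped]; return 1; }
    return 0;
}

void match(const int *ref, int nref, const int *ana, int nana) {
    int r = 0;                                  /* cursor into the reference trace */
    for (int a = 0; a < nana; a++) {
        if (take_skipped(ana[a])) continue;     /* matches a previously skipped call */
        int found = 0;
        for (int j = r; j < nref; j++) {        /* scan forward in the reference trace */
            if (ref[j] == ana[a]) {
                while (r < j) buf_skipped[n_skipped++] = ref[r++];  /* park skipped calls */
                r = j + 1; found = 1; break;
            }
        }
        if (!found) buf_extra[n_extra++] = ana[a];  /* present only in the analysis run */
    }
    while (r < nref) buf_skipped[n_skipped++] = ref[r++];
    printf("skipped=%d extra=%d -> %s\n", n_skipped, n_extra,
           (n_skipped || n_extra) ? "deviation: escalate to Ether" : "traces match");
}

int main(void) {
    int ref[] = {3, 4, 1, 2}, ana[] = {1, 3, 2, 5};   /* the example from the slide */
    match(ref, 4, ana, 4);
    return 0;
}
```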
OUTLINE • Introduction and Related Work • Our Approach • Implementation • Evaluation • Conclusion
Implementation
• A kernel driver
  • Traps all system calls by hooking the “System Service Descriptor Table” (SSDT, sketched below)
  • Each system call has two handlers: log and replay
• A user-space application
  • Starts and controls the driver
  • Starts the process that has to be analyzed
  • Stores the data generated during the logging phase
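A much-simplified sketch of the SSDT hook the kernel driver would install. It assumes 32-bit Windows XP, ignores the CR0 write-protection and multiprocessor issues a real driver must handle, and the LogHandler/ReplayHandler names are invented; real replacement handlers must also match the prototype of the system call they replace.

```c
#include <ntddk.h>

/* Layout of the system service descriptor table on 32-bit Windows
 * (not in the public headers; this is the commonly used declaration). */
typedef struct _SSDT {
    PULONG ServiceTableBase;        /* array of system call entry points */
    PULONG ServiceCounterTableBase;
    ULONG  NumberOfServices;
    PUCHAR ParamTableBase;
} SSDT;

extern SSDT KeServiceDescriptorTable;   /* exported by ntoskrnl, 32-bit only */

extern NTSTATUS LogHandler(VOID);       /* hypothetical logging-mode handler */
extern NTSTATUS ReplayHandler(VOID);    /* hypothetical replay-mode handler  */
extern BOOLEAN  ReplayMode;             /* set by the user-space controller  */

static ULONG OriginalEntry;             /* saved original NtXxx entry point  */

/* Redirect one SSDT entry so every invocation of that system call is trapped.
 * A real driver must also disable the CR0 write-protect bit (or remap the
 * table through an MDL) before writing, and hook every call of interest.    */
VOID HookSyscall(ULONG index)
{
    OriginalEntry = KeServiceDescriptorTable.ServiceTableBase[index];
    KeServiceDescriptorTable.ServiceTableBase[index] =
        (ULONG)(ULONG_PTR)(ReplayMode ? ReplayHandler : LogHandler);
}
```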
Practical Aspects
• Handle consistency
  • Live handles vs. replayed handles
  • Check each handle against a list of all replayed handles (sketched below)
• Networking
  • NtDeviceIoControlFile()
  • Device-dependent parameters
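Handles returned from the log during replay do not exist in the live OS, so the driver has to remember which handle values are replayed and serve later calls on them from the log, while calls on live handles can still reach the underlying OS. A minimal sketch of such a list; the fixed-size array and the function names are assumptions.

```c
#include <stdbool.h>
#include <stdint.h>

/* Handles handed out from the log during replay ("replayed handles").
 * Handles obtained from system calls that were passed through to the
 * real OS ("live handles") are NOT in this list.                      */
#define MAX_REPLAYED 1024

static uintptr_t replayed[MAX_REPLAYED];
static int n_replayed = 0;

void remember_replayed_handle(uintptr_t h) {
    if (n_replayed < MAX_REPLAYED)
        replayed[n_replayed++] = h;
}

/* Later system calls consult this: a replayed handle must be served
 * from the log, a live handle can go to the underlying OS.           */
bool is_replayed_handle(uintptr_t h) {
    for (int i = 0; i < n_replayed; i++)
        if (replayed[i] == h)
            return true;
    return false;
}
```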
Practical Aspects (cont.)
• Deferred results
  • STATUS_PENDING
  • NtWaitForSingleObject()
• Thread management
  • NtCreateThread()
  • Each thread gets its own log (see the sketch below)
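Since each thread gets its own log, the driver needs some per-thread bookkeeping: when NtCreateThread is observed, a fresh log is opened for the child thread, and later calls are matched against the log of the issuing thread. The structure below is an illustrative guess at that bookkeeping, not the authors' data structure.

```c
#include <stdint.h>
#include <stddef.h>

/* One log stream per thread: when NtCreateThread is replayed, the new
 * thread gets its own log so its calls are matched independently.     */
#define MAX_THREADS 64

typedef struct {
    uint32_t thread_id;     /* thread this log belongs to               */
    uint32_t next_record;   /* cursor: next record to replay/compare    */
    uint32_t log_id;        /* which on-disk log file backs this thread */
} thread_log;

static thread_log logs[MAX_THREADS];
static int n_logs = 0;

thread_log *log_for_thread(uint32_t tid) {
    for (int i = 0; i < n_logs; i++)
        if (logs[i].thread_id == tid)
            return &logs[i];
    return NULL;
}

/* Called when NtCreateThread is observed: open a fresh log for the child. */
void on_thread_created(uint32_t child_tid, uint32_t new_log_id) {
    if (n_logs < MAX_THREADS)
        logs[n_logs++] = (thread_log){ child_tid, 0, new_log_id };
}
```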
Limitations
• Memory-mapped files
  • DLLs
  • A file accessed through a memory mapping generates no read/write system calls, so those accesses are removed from the trace
• Multiple processes
• Random numbers
  • KsecDD
• Inter-process communication and asynchronous calls
  • Checks are postponed
OUTLINE • Introduction and Related Work • Our Approach • Implementation • Evaluation • Conclusion
Evaluation
• Setup: Microsoft Windows XP Service Pack 3
  • VMware virtual machine (reference system)
  • Anubis system (QEMU-based)
• 1. Log and replay of six programs (success)
• 2. SDBot (fail)
  • Spawns a new process, e.g., via NtCreateProcess
  • Six different versions that detect VMware (success)
    • Red Pill, Scoopy, VMDetect, and SourPill
Evaluation (cont.)
• 3. Real malware with no VM checks

Evaluation (cont.)
• 4. Real malware with VM checks
Performance
• Overhead depends on the type of operation
  • 1% on average
• Example: compressing a 1 KB random file
  • CMD: 7za.exe a test.zip 1KB_rand_file
  • Anubis: 4.267 sec
  • Ether: 77.325 sec
  • Our VMware reference system: 1.640 sec
OUTLINE • Introduction and Related Work • Our Approach • Implementation • Evaluation • Conclusion
Conclusion
• A prototype that records system calls and replays them
• Samples whose behavior differs still need a fully transparent analysis system (e.g., Ether) for further examination