Entropy and Malware Detection
ITEC808 Final Project Presentation
Vithushan Sivalingam, Student No: 42413753
Supervisors: Prof. Vijay Varadharajan & Dr Udaya Tupakula
11th November 2011
Contents • Introduction • Project Aims • Shannon's Entropy Review • Malware • Entropy Techniques with Malware • Analysis of the Schemes • Discussion • Conclusion • Future Work
Introduction • Entropy quantifies the uncertainty involved in predicting the value of a random variable. • The outcome of a fair coin flip (two equally likely outcomes) provides less information (lower entropy) than the outcome of a roll of a die (six equally likely outcomes). • In the real world, most collections of data fall somewhere in between. • False positive: the detector flags software as malicious, but it turns out to be benign. • False negative: the software is malicious, but the detector misses it.
Malware detection plays a significant role in protecting against attacks launched over communication networks. • Current malware detection tools still cannot reliably handle encrypted and packed malware. • This project explores improving malware detection through entropy techniques.
Project Aims • The main goal of this project was to investigate the development of suitable entropy techniques to detect malware. • The ITEC808 literature review components are: • Reviewing Shannon's entropy method. • Identifying malware attributes and functionality. • Building a detailed understanding of entropy techniques and malware detection. • Studying entropy-based malware detection schemes. • Analysing and reasoning about the efficiency of the proposed schemes.
Problems and Significance • Understanding the entropy theorem. • Malware growth, and identifying malware attributes and functionality. • Understanding statistical variation in malware executables.
Investigate the development of suitable entropy techniques to detect malware. • Such techniques could help security analysts identify packed or encrypted malware samples more efficiently.
Shannon's Entropy Review • Point-to-point communication. • Given two random variables, what can we say about one when we know the other? This is the central problem of information theory. • Keywords: choice, uncertainty and entropy.
The entropy of a random variable X (the information source) is defined by H(X) = -Σ p(x) log2 p(x), where the sum runs over all values x of X. • The entropy is non-negative; it is zero exactly when the outcome of the random variable is certain.
Fair distribution • Coin flip {0.5, 0.5}: H(X) = -0.5 log2 0.5 - 0.5 log2 0.5 = 1 bit (each flip conveys 1 bit of information). • Known distribution • Double-headed coin {1}: H(X) = -1 log2 1 = 0 bits. • Unfair distribution • Unfair coin {0.75, 0.25}: H(X) = -0.75 log2 0.75 - 0.25 log2 0.25 ≈ 0.811 bits.
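As a quick sanity check on the definition, a minimal Python sketch (the function name shannon_entropy is ours) that reproduces the three coin examples above:

```python
import math

def shannon_entropy(probs):
    """H(X) = -sum p(x) log2 p(x), in bits; zero-probability terms
    contribute nothing. max() normalises a -0.0 result to 0.0."""
    return max(0.0, -sum(p * math.log2(p) for p in probs if p > 0))

print(shannon_entropy([0.5, 0.5]))    # fair coin     -> 1.0 bit
print(shannon_entropy([1.0]))         # double-headed -> 0.0 bits
print(shannon_entropy([0.75, 0.25]))  # unfair coin   -> ~0.811 bits
```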
[Plot: entropy H(X) in bits against the probability p of one outcome.] • For the fair distribution, entropy reaches its maximum (1 bit). • For a known distribution (p = 1 or p = 0), entropy is 0 bits. • For an unfair distribution, entropy lies below the maximum (the outcomes are not balanced).
Joint Entropy • For two random variables X and Y, the joint entropy is defined by H(X, Y) = -Σ p(x, y) log2 p(x, y), where the sum runs over all pairs (x, y). • Conditional entropy • When X and Y are dependent, the conditional entropy H(X|Y) measures the extra information X contains once Y is disclosed. • These quantities are linked by the chain rule of entropy.
Entropy Mutual Information (Information Gain) • I(X; Y) = H(X) - H(X|Y) = H(Y) - H(Y|X) • H(X, Y) = H(X) + H(Y) when X and Y are independent. • H(X, Y) < H(X) + H(Y) when they are dependent. • Chain rule: H(X, Y) = H(X) + H(Y|X) = H(Y) + H(X|Y). • These entropy measures help to build the detection models.
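A small self-contained sketch of these identities, using a made-up joint distribution of two dependent binary variables (all names are ours): it derives the conditional entropy via the chain rule and the information gain from it.

```python
import math

def H(probs):
    """Shannon entropy in bits of a collection of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution p(x, y) for two dependent binary variables.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal distributions p(x) and p(y).
p_x, p_y = {}, {}
for (x, y), p in joint.items():
    p_x[x] = p_x.get(x, 0.0) + p
    p_y[y] = p_y.get(y, 0.0) + p

H_XY = H(joint.values())                  # joint entropy H(X,Y)
H_X, H_Y = H(p_x.values()), H(p_y.values())
H_Y_given_X = H_XY - H_X                  # chain rule: H(X,Y) = H(X) + H(Y|X)
info_gain = H_Y - H_Y_given_X             # mutual information I(X;Y)

print(f"H(X)={H_X:.3f}  H(Y)={H_Y:.3f}  H(X,Y)={H_XY:.3f}")
print(f"H(Y|X)={H_Y_given_X:.3f}  I(X;Y)={info_gain:.3f}")
print(H_XY < H_X + H_Y)                   # True: dependent, so joint < sum
```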
Malware • Malware is characterised by its attributes, behaviours and attack patterns.
It has been reported that, among 20,000 malware samples, more than 80% were packed by packers from 150 different families. • Malware that is modified by runtime encryption or compression is known as packed malware. • Packing compresses an executable file and modifies it to contain the code needed to decompress itself at runtime.
A packed executable is built from two main parts. • First, the original executable is compressed and stored inside the packed executable as data. • Second, a decompression section (stub) is added to the packed executable; this section restores the original executable at runtime. A toy illustration of this layout follows.
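As a toy illustration of the two-part layout, and not a real packer, the following sketch (the pack helper and its payload are invented here) builds a "packed" program from a compressed copy of the original code plus a small stub that restores and runs it:

```python
import base64
import zlib

def pack(payload: bytes) -> str:
    """Build a toy 'packed' program: a compressed copy of the original
    code (part 1) plus a stub that restores and runs it (part 2)."""
    blob = base64.b64encode(zlib.compress(payload)).decode("ascii")
    return (
        "import zlib, base64\n"
        f"packed = base64.b64decode('{blob}')\n"
        "original = zlib.decompress(packed)   # the restore step\n"
        "exec(original.decode())              # run the original code\n"
    )

# The 'original executable' here is just a one-line Python program.
print(pack(b"print('hello from the original program')"))
```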
Entropy techniques with malware • The entropy of packed information is higher than that of the original information. • Compression removes redundancy, so the resulting series of bits becomes more unpredictable; this unpredictability is exactly what entropy measures. • Packed information: high uncertainty, high entropy. Original information: lower uncertainty, lower entropy. A short demonstration follows. • False alarms play a big role: legitimately compressed and encrypted files could trigger false positives.
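A minimal demonstration of this effect using Python's standard zlib module (byte_entropy is our own helper and the sample strings are arbitrary): compressing redundant data raises its per-byte entropy, and random bytes, which mimic ciphertext, sit near the 8-bit maximum.

```python
import math
import os
import zlib
from collections import Counter

def byte_entropy(data: bytes) -> float:
    """Shannon entropy of a byte string, in bits per byte (0..8)."""
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in Counter(data).values())

original = b"attack at dawn " * 500   # highly redundant plaintext
packed = zlib.compress(original)      # redundancy stripped out

print(f"original: {byte_entropy(original):.3f} bits/byte")  # low
print(f"packed:   {byte_entropy(packed):.3f} bits/byte")    # clearly higher
print(f"random:   {byte_entropy(os.urandom(4096)):.3f} bits/byte")  # near 8
```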
Entropy can nevertheless be used to determine whether a file is anomalous. • Establish categories based on different entropy levels. • If the entropy is above a threshold, the file is categorised as (potentially) malicious; below that value it is treated as benign. • In other words, entropy can serve as a measure to classify software as malware, as sketched below.
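A minimal sketch of such a threshold classifier. The 7.0 bits/byte cut-off is purely illustrative: a real deployment would calibrate it on labelled benign and packed/encrypted samples.

```python
import math
import sys
from collections import Counter

def byte_entropy(data: bytes) -> float:
    """Shannon entropy of a byte string, in bits per byte (0..8)."""
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in Counter(data).values())

THRESHOLD = 7.0  # illustrative only; calibrate on labelled samples

def classify(path: str) -> str:
    with open(path, "rb") as f:
        h = byte_entropy(f.read())
    verdict = "possibly packed/encrypted" if h > THRESHOLD else "unremarkable"
    return f"{path}: {h:.3f} bits/byte -> {verdict}"

if __name__ == "__main__":
    for path in sys.argv[1:]:
        print(classify(path))
```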
Analysis of the schemes • In "Information-theoretic Measures for Anomaly Detection": • Objective • Provide a theoretical foundation, as well as useful tools, to facilitate the IDS development process and improve the effectiveness of intrusion detection technologies. • Experiments on • University of New Mexico (UNM) sendmail system call data • MIT Lincoln Lab sendmail BSM data • MIT Lincoln Lab tcpdump data
Approach: • Entropy and conditional entropy measure regularity • and determine how to build a model. • Relative (conditional) entropy measures how the regularities of the training and test datasets relate • and determines the performance of a model on test data. • A classification approach: given the first k system calls, predict the (k+1)th system call, as sketched below.
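A rough, self-contained sketch of that prediction idea, simplified from the paper (the function names and the toy trace are ours): build a table of k-call contexts, compute the conditional entropy of the next call given its context, and measure the misclassification rate of predicting each next call as its context's most frequent follower.

```python
import math
from collections import Counter, defaultdict

def context_table(trace, k):
    """Count, for every k-call context, which call follows it."""
    table = defaultdict(Counter)
    for i in range(len(trace) - k):
        table[tuple(trace[i:i + k])][trace[i + k]] += 1
    return table

def conditional_entropy(trace, k):
    """H(next call | previous k calls) in bits; lower = more regular."""
    table = context_table(trace, k)
    total = len(trace) - k
    h = 0.0
    for followers in table.values():
        n_ctx = sum(followers.values())
        for c in followers.values():
            h -= (c / total) * math.log2(c / n_ctx)
    return h

def misclassification_rate(trace, k):
    """Predict each (k+1)th call as its context's most frequent
    follower; return the fraction of wrong predictions."""
    table = context_table(trace, k)
    errors = 0
    for i in range(len(trace) - k):
        predicted = table[tuple(trace[i:i + k])].most_common(1)[0][0]
        errors += predicted != trace[i + k]
    return errors / (len(trace) - k)

# Toy trace standing in for a UNM sendmail system-call trace.
trace = ["open", "read", "write", "close"] * 50 + ["open", "read", "exec"]
for k in (1, 2, 4):
    print(k, conditional_entropy(trace, k), misclassification_rate(trace, k))
```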
Conditional Entropy of Training Data (UNM) • The more information is included (the longer the sequence length), the lower the conditional entropy, i.e. the more regular the dataset.
Misclassification Rate: Training Data • Misclassification means that the classification process assigns an item to class A while its actual class is B. • The misclassification rate is used to measure anomaly detection performance.
Conditional Entropy vs. Misclassification Rate • The movement of the misclassification rate coincides with the movement of the conditional entropy. • The estimated movement of the misclassification rate can therefore be used to select a sequence length for the detection model. • E.g. length 6 is better than 4, and 14 is better than 6.
Misclassification Rate of Testing Data and Intrusion Data • The misclassification rate is used as an indicator to determine whether a trace is abnormal or normal.
Other Schemes: Objectives • "Generic Unpacking using Entropy Analysis" analyses how to use entropy to quickly and efficiently identify packed or encrypted malware executables, and offers results from its testing methodology. • Uses the bintropy technique. • "Entropy estimation for real-time encrypted traffic identification" describes a novel approach to classify network traffic into encrypted and unencrypted traffic. • Real-time encrypted traffic detector (RTETD). • The classifier is able to operate in real time, as only the first packet of each flow is processed. • Evaluated on encrypted Skype traffic.
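A much-simplified sketch of the first-packet idea. The real RTETD compares the payload's entropy estimate against the value expected for random data of the same length; the fixed 7.5 bits/byte cut-off and the sample payloads here are assumptions for illustration.

```python
import math
import os
from collections import Counter

def byte_entropy(data: bytes) -> float:
    """Shannon entropy of a byte string, in bits per byte (0..8)."""
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in Counter(data).values())

# Assumed cut-off: ciphertext looks near-random (close to 8 bits/byte),
# while plaintext protocol headers sit far lower.
CUTOFF = 7.5

def looks_encrypted(first_packet_payload: bytes) -> bool:
    """Decide from the first packet of a flow only, as RTETD does."""
    return byte_entropy(first_packet_payload) > CUTOFF

print(looks_encrypted(b"GET / HTTP/1.1\r\nHost: example.com\r\n\r\n"))  # False
print(looks_encrypted(os.urandom(1400)))  # True: random bytes mimic ciphertext
```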
Discussion • Through studying the schemes and information theory, I found the following. • Entropy can be used to measure the regularity of a dataset of mixed records. • Conditional entropy can be used to measure the regularity of sequential dependencies in a dataset of structured records. • Relative entropy can be used to measure the relationship between the regularity (consistency) measures of two datasets. • Information gain can be used to guide the classification of data items.
Conclusion • Reviewed and analysed Shannon's entropy theory, with examples. • Researched and identified (packed) malware functionality, characteristics and attributes. • Analysed entropy-based detection schemes. • These findings will be followed up in future work.
Future Work • Investigate entropy analysis for selected software samples. • Use the entropy techniques to compute entropy scores for selected malware executable samples. • Identify the experimental tools. • We plan to analyse the malware samples using commercial tools, e.g. the PECompact executable compressor.
References • C. E. Shannon, "The Mathematical Theory of Communication", reprinted with corrections from The Bell System Technical Journal, vol. 27, pp. 379-423, 623-656, July, October 1948. • M. Morgenstern and A. Marx, "Runtime packer testing experiences", in Proceedings of the 2nd International CARO Workshop, 2008. • W. Lee and D. Xiang, "Information-theoretic Measures for Anomaly Detection", in IEEE Symposium on Security and Privacy, Oakland, CA, pp. 130-143, 2001. • M. Morgenstern and Hendrik Pilz (AV-Test GmbH, Magdeburg, Germany), "Useful and useless statistics about viruses and anti-virus programs", presented at CARO 2010, Helsinki. • R. Lyda and J. Hamrock, "Using Entropy Analysis to Find Encrypted and Packed Malware", IEEE Security & Privacy, vol. 5, no. 2, pp. 40-45, March-April 2007, DOI 10.1109/MSP.2007.48. • Guhyeon Jeong, Euijin Choo, Joosuk Lee, Munkhbayar Bat-Erdene and Heejo Lee, "Generic Unpacking using Entropy Analysis", Div. of Computer & Communication Engineering, Korea University, Seoul, Republic of Korea, 2010. • Peter Dorfinger, Georg Panholzer and Wolfgang John, "Entropy estimation for real-time encrypted traffic identification", Salzburg Research, Salzburg, Austria, 2010.