Various automated techniques are described herein for the runtime detection/neutralization of malware executing on a computing device. The foregoing is achievable during a relatively early phase, for example, before the malware manages to encrypt any of the user's files. For instance, a malicious process detector may create decoy file(s) in a directory. The decoy file(s) may have attributes that cause such file(s) to reside at the beginning and/or end of a file list. By doing so, a malicious process targeting files in the directory will attempt to encrypt the decoy file(s) before any other file. The detector monitors operations to the decoy file(s) to determine whether a malicious process is active on the user's computing device. In response to determining that a malicious process is active, the malicious process detector takes protective measure(s) to neutralize the malicious process.
1. A method for malware prevention performed by a computing device, comprising:
creating one or more decoy files in a file directory that stores one or more other non-decoy files before the file directory is read by a malicious process;
determining that one or more file access operations are being performed with respect to at least one of the one or more decoy files;
analyzing the one or more file access operations to determine whether the one or more file access operations originate from the malicious process; and
in response to determining that the one or more file access operations originate from the malicious process, performing an action to neutralize the malicious process.
17. A computer-readable storage medium having program instructions recorded thereon that, when executed by a processing device, perform a method for detecting a malicious process, the method comprising:
creating one or more decoy files in a file directory that stores one or more other non-decoy files before the file directory is read by the malicious process;
determining that one or more file access operations are being performed with respect to at least one of the one or more decoy files;
analyzing the one or more file access operations to determine whether the one or more file access operations originate from the malicious process; and
in response to determining that the one or more file access operations originate from the malicious process, performing an action to neutralize the malicious process.
9. A system, comprising:
one or more processors; and
a memory coupled to the one or more processors, the memory storing instructions, which, when executed by the one or more processors, cause the one or more processors to perform operations, the operations comprising:
creating one or more decoy files in a file directory that stores one or more other non-decoy files before the file directory is read by a malicious process;
determining that one or more file access operations are being performed with respect to at least one of the one or more decoy files;
analyzing the one or more file access operations to determine whether the one or more file access operations originate from the malicious process; and
in response to determining that the one or more file access operations originate from the malicious process, performing an action to neutralize the malicious process.
2. The method of claim 1, wherein performing the action comprises at least one of:
terminating the malicious process;
suspending the malicious process;
performing a backup of the one or more other non-decoy files stored in the file directory;
checking an integrity of the one or more other non-decoy files;
activating an anti-virus program;
recording in an event log an event that indicates that the malicious process performed the one or more file access operations to the one or more decoy files; or
prompting a user of the computing device to indicate an operation to perform.
3. The method of claim 1, further comprising:
periodically modifying one or more attributes of the one or more decoy files such that a sorting operation performed on the files stored in the directory causes the one or more decoy files to be listed before the other one or more non-decoy files in a list generated by the sorting operation.
4. The method of claim 3, wherein the one or more attributes comprise at least one of:
a file name;
a file size;
a creation date;
a modification date;
a file type; or
file content.
5. The method of claim 1, wherein analyzing the one or more file access operations comprises:
identifying a pattern associated with the one or more file access operations that are performed with respect to the one or more decoy files; and
providing the pattern as an input to a machine-learning-based algorithm that outputs an indication of whether the pattern is a legal file access pattern or an illegal file access pattern, the machine-learning-based algorithm being trained on observed file access patterns for the one or more other non-decoy files.
6. The method of claim 5, wherein the machine-learning-based algorithm outputs a probability that the pattern is a legal file access pattern, and wherein analyzing the one or more file access operations further comprises:
comparing the probability to a threshold.
7. The method of claim 1, wherein analyzing the one or more file access operations comprises:
identifying a pattern associated with the one or more file access operations that are performed with respect to the one or more decoy files; and
applying one or more rules to the pattern to determine whether the one or more file access operations originate from the malicious process.
8. The method of claim 7, wherein the pattern associated with the one or more file access operations comprises a read operation to the decoy file or to a portion thereof and a write operation to the same decoy file or the same portion thereof.
10. The system of claim 9, wherein performing the action comprises at least one of:
terminating the malicious process;
suspending the malicious process;
performing a backup of the one or more other non-decoy files stored in the file directory;
checking an integrity of the one or more other non-decoy files;
activating an anti-virus program;
recording in an event log an event that indicates that the malicious process performed the one or more file access operations to the one or more decoy files; or
prompting a user of the computing device to indicate an operation to perform.
11. The system of claim 9, the operations further comprising:
periodically modifying one or more attributes of the one or more decoy files such that a sorting operation performed on the files stored in the directory causes the one or more decoy files to be listed before the other one or more non-decoy files in a list generated by the sorting operation.
12. The system of claim 11, wherein the one or more attributes comprise at least one of:
a file name;
a file size;
a creation date;
a modification date;
a file type; or
file content.
13. The system of claim 9, wherein analyzing the one or more file access operations comprises:
identifying a pattern associated with the one or more file access operations that are performed with respect to the one or more decoy files; and
providing the pattern as an input to a machine-learning-based algorithm that outputs an indication of whether the pattern is a legal file access pattern or an illegal file access pattern, the machine-learning-based algorithm being trained on observed file access patterns for the one or more other non-decoy files.
14. The system of claim 13, wherein the machine-learning-based algorithm outputs a probability that the pattern is a legal file access pattern, and wherein analyzing the one or more file access operations further comprises:
comparing the probability to a threshold.
15. The system of claim 9, wherein analyzing the one or more file access operations comprises:
identifying a pattern associated with the one or more file access operations that are performed with respect to the one or more decoy files; and
applying one or more rules to the pattern to determine whether the one or more file access operations originate from the malicious process.
16. The system of claim 15, wherein the pattern associated with the one or more file access operations comprises a read operation to the decoy file or to a portion thereof and a write operation to the same decoy file or the same portion thereof.
18. The computer-readable storage medium of claim 17, wherein performing the action comprises at least one of:
terminating the malicious process;
suspending the malicious process;
performing a backup of the one or more other non-decoy files stored in the file directory;
checking an integrity of the one or more other non-decoy files;
activating an anti-virus program;
recording in an event log an event that indicates that the malicious process performed the one or more file access operations to the one or more decoy files; or
prompting a user of the computing device to indicate an operation to perform.
19. The computer-readable storage medium of claim 17, the method further comprising:
periodically modifying one or more attributes of each of the one or more decoy files such that a sorting operation performed on the files stored in the directory causes the one or more decoy files to be listed before the other one or more non-decoy files in a list generated by the sorting process.
20. The computer-readable storage medium of claim 19, wherein the one or more attributes comprise at least one of a file name, a file size, a creation date, or a modification date.
This application is a U.S. national phase application of PCT/IB2017/058485, filed on Dec. 28, 2017, which claims priority to U.S. Provisional Application Ser. No. 62/445,015, filed Jan. 11, 2017 (both entitled “Early Runtime Detection and Prevention of Ransomware”), the entireties of which are incorporated by reference herein.
Embodiments described herein generally relate to detecting and/or neutralizing malware, such as ransomware, or other security threats on computer systems.
In recent years, ransomware has been recognized as one of the most serious cyber threats. Ransomware typically encrypts important documents on a target computer. In order to decrypt the documents, the user must pay a considerable ransom. In cases in which the targeted files have not been backed up, security experts often advise the victim to pay the ransom because there is no effective way to restore the encrypted data.
Methods, systems, and apparatuses are described for detecting and/or neutralizing malware or other security threats on computer systems, such as ransomware, substantially as shown in and/or described herein in connection with at least one of the figures, as set forth more completely in the claims.
The accompanying drawings, which are incorporated herein and form part of the specification, illustrate embodiments of the present invention and, together with the description, further serve to explain the principles involved and to enable a person skilled in the relevant art(s) to make and use the disclosed technologies.
The features and advantages of the disclosed technologies will become more apparent from the detailed description set forth below when taken in conjunction with the drawings, in which like reference characters identify corresponding elements throughout. In the drawings, like reference numbers generally indicate identical, functionally similar, and/or structurally similar elements. The drawing in which an element first appears is indicated by the leftmost digit(s) in the corresponding reference number.
The following detailed description refers to the accompanying drawings that illustrate exemplary embodiments of the present invention. However, the scope of the present invention is not limited to these embodiments, but is instead defined by the appended claims. Thus, embodiments beyond those shown in the accompanying drawings, such as modified versions of the illustrated embodiments, may nevertheless be encompassed by the present invention.
References in the specification to “one embodiment,” “an embodiment,” “an example embodiment,” or the like, indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Furthermore, when a particular feature, structure, or characteristic is described in connection with an embodiment, it is submitted that it is within the knowledge of one skilled in the art to implement such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described.
Numerous exemplary embodiments are now described. The section/subsection headings utilized herein are not intended to be limiting. Embodiments are described throughout this document, and any type of embodiment may be included under any section/subsection. Furthermore, it is contemplated that the disclosed embodiments may be combined with each other in any manner.
Conventional anti-malware programs do not offer an effective systematic method for handling ransomware. Prior techniques have tried to address the problem with various back-up methods and static signature-based detection of ransomware files by antivirus utilities or similar facilities. However, such techniques still result in at least a portion of the user's documents being encrypted. Further steps must be taken to detect ransomware before any user documents are encrypted.
In particular, a method for malware prevention performed by a computing device is described herein. In accordance with the method, one or more decoy files in a file directory that stores one or more other files are created. A determination is made that one or more file access operations are being performed with respect to at least one of the one or more decoy files. The one or more file access operations are analyzed to determine whether the one or more file access operations originate from a malicious process. In response to determining that the one or more file access operations originate from the malicious process, an action is performed to neutralize the malicious process.
In accordance with one or more embodiments, the performing step comprises at least one of terminating the malicious process, suspending the malicious process, performing a backup of the one or more other files stored in the file directory, checking an integrity of the one or more other files, activating an anti-virus program, recording in an event log an event that indicates that the malicious process performed the one or more file access operations to the one or more decoy files, or prompting a user of the computing device to indicate an operation to perform.
In accordance with one or more embodiments, the method further comprises periodically modifying one or more attributes of the one or more decoy files such that a sorting operation performed on the files stored in the directory causes the one or more decoy files to be listed before the other one or more files in a list generated by the sorting operation.
In accordance with one or more embodiments, the one or more attributes comprise at least one of a file name, a file size, a creation date, or a modification date.
In accordance with one or more embodiments, the analyzing step comprises identifying a pattern associated with the one or more file access operations that are performed with respect to the one or more decoy files and providing the pattern as an input to a machine-learning-based algorithm that outputs an indication of whether the pattern is a legal file access pattern or an illegal file access pattern, the machine-learning-based algorithm being trained on observed file access patterns for the one or more other files.
In accordance with one or more embodiments, the machine-learning based algorithm outputs a probability that the pattern is a legal file access pattern and the analyzing step further comprises comparing the probability to a threshold.
In accordance with one or more embodiments, the analyzing step comprises identifying a pattern associated with the one or more file access operations that are performed with respect to the one or more decoy files and applying one or more rules to the pattern to determine whether the one or more file access operations originate from the malicious process.
In accordance with one or more embodiments, the pattern associated with the one or more file access operations comprises a read operation to the decoy file or to a portion thereof and a write operation to the same decoy file or the same portion thereof.
A system is also described herein. The system includes one or more processors and a memory coupled to the one or more processors, the memory storing instructions, which, when executed by one or more processors, cause the one or more processors to perform operations. In accordance with the operations, one or more decoy files in a file directory that stores one or more other files are created. A determination is made that one or more file access operations are being performed with respect to at least one of the one or more decoy files. The one or more file access operations are analyzed to determine whether the one or more file access operations originate from a malicious process. In response to determining that the one or more file access operations originate from the malicious process, an action is performed to neutralize the malicious process.
In accordance with one or more embodiments, the performing step comprises at least one of terminating the malicious process, suspending the malicious process, performing a backup of the one or more other files stored in the file directory, checking an integrity of the one or more other files, activating an anti-virus program, recording in an event log an event that indicates that the malicious process performed the one or more file access operations to the one or more decoy files, or prompting a user of the computing device to indicate an operation to perform.
In accordance with one or more embodiments, the operations further comprise periodically modifying one or more attributes of the one or more decoy files such that a sorting operation performed on the files stored in the directory causes the one or more decoy files to be listed before the other one or more files in a list generated by the sorting operation.
In accordance with one or more embodiments, the one or more attributes comprise at least one of a file name, a file size, a creation date, or a modification date.
In accordance with one or more embodiments, the analyzing step comprises identifying a pattern associated with the one or more file access operations that are performed with respect to the one or more decoy files and providing the pattern as an input to a machine-learning-based algorithm that outputs an indication of whether the pattern is a legal file access pattern or an illegal file access pattern, the machine-learning-based algorithm being trained on observed file access patterns for the one or more other files.
In accordance with one or more embodiments, the machine-learning based algorithm outputs a probability that the pattern is a legal file access pattern and the analyzing step further comprises comparing the probability to a threshold.
In accordance with one or more embodiments, the analyzing step comprises identifying a pattern associated with the one or more file access operations that are performed with respect to the one or more decoy files and applying one or more rules to the pattern to determine whether the one or more file access operations originate from the malicious process.
In accordance with one or more embodiments, the pattern associated with the one or more file access operations comprises a read operation to the decoy file or to a portion thereof and a write operation to the same decoy file or the same portion thereof.
A computer-readable storage medium having program instructions recorded thereon that, when executed by a processing device, perform a method for detecting a malicious process is further described herein. In accordance with the method, one or more decoy files in a file directory that stores one or more other files are created. A determination is made that one or more file access operations are being performed with respect to at least one of the one or more decoy files. The one or more file access operations are analyzed to determine whether the one or more file access operations originate from a malicious process. In response to determining that the one or more file access operations originate from the malicious process, an action is performed to neutralize the malicious process.
In accordance with one or more embodiments, the performing step comprises at least one of terminating the malicious process, suspending the malicious process, performing a backup of the one or more other files stored in the file directory, checking an integrity of the one or more other files, activating an anti-virus program, recording in an event log an event that indicates that the malicious process performed the one or more file access operations to the one or more decoy files, or prompting a user of the computing device to indicate an operation to perform.
In accordance with one or more embodiments, the method further comprises periodically modifying one or more attributes of the one or more decoy files such that a sorting operation performed on the files stored in the directory causes the one or more decoy files to be listed before the other one or more files in a list generated by the sorting operation.
In accordance with one or more embodiments, the one or more attributes comprise at least one of a file name, a file size, a creation date, or a modification date.
Various automated techniques are described herein for the runtime detection and/or neutralization of malware (e.g., ransomware) executing on a computing device. The foregoing may be achieved during a relatively early phase (e.g., soon after the malware begins executing), for example, before the malware manages to encrypt any of the user's files. For instance, a malicious process detector may create one or more decoy files in a directory. The decoy file(s) may have attributes that cause such file(s) to reside at the beginning and/or end of a file list. By doing so, a malicious process targeting files in the directory will attempt to encrypt the decoy file(s) before any other file. The malicious process detector monitors operations to the decoy file(s) to determine whether a malicious process is active on the user's computing device. In response to determining that a malicious process is active, the malicious process detector takes one or more protective measures to neutralize the malicious process. By having the malicious process intentionally target the decoy file(s) first, the risk of having important user files compromised before detection of the malicious process is greatly reduced.
For the sake of brevity, embodiments described herein are described in terms of the Microsoft Windows® Operating System (OS), published by Microsoft Corporation of Redmond, Wash. However, as should be clear to any person skilled in the art, this is just one possible embodiment. Similar embodiments may protect practically all kinds of modern operating systems, including LINUX® and other UNIX® variants, against a very wide array of malicious-code attacks, whether remote or local.
For instance, FIG. 1 is a block diagram of an example computing device 100 in accordance with an embodiment.
As further shown in FIG. 1, computing device 100 includes one or more processor(s) 102 and a memory 104.
Memory 104 comprises one or more computer-readable memory devices that operate to store computer programs and data. Memory 104 may be implemented using any of a wide variety of hardware-based, volatile computer-readable memory devices including, but not limited to, random access memory (RAM) devices and/or non-volatile computer-readable memory devices, including but not limited to, read-only memory (ROM) devices, solid state drives, hard disk drives, magnetic storage media such as magnetic disks and associated drives, optical storage media such as optical disks and associated drives, and flash memory devices such as USB flash drives. Processor(s) 102 are connected to memory 104 via one or more suitable interfaces.
As further shown in FIG. 1, memory 104 stores an operating system 106.
Operating system 106 may comprise a file system 108 that is operable to name, store, access and organize files. In accordance with an embodiment, file system 108 stores files, directories and information needed to locate and access such items. File system 108 may be capable of storing files to a variety of physical media (e.g., memory 104), including but not limited to one or more hard disk drives, solid state drives, optical discs, magnetic tapes, flash memory devices, or the like. For example, as shown in FIG. 1, file system 108 maintains one or more director(ies) 110, which store one or more file(s) 112.
Computing device 100 is configured to detect malicious processes and/or prevent such processes from compromising (e.g., encrypting) such file(s) 112. For example, as shown in FIG. 1, memory 104 stores a malicious process detector 114.
Malicious process detector 114 may create one or more decoy files 116 in one or more of director(ies) 110. Examples of such directories include, but are not limited to, a default documents storage directory of operating system 106, directories that contain user documents, spreadsheets, pictures, or images, or any other directory maintained by file system 108. It is noted that, in addition to or in lieu of file(s) 112 and decoy file(s) 116 being stored in director(ies) 110, file(s) 112 and decoy file(s) 116 may be stored in any suitable storage location and in accordance with any suitable organization.
When a computing process (or “process”) reads a directory, a file list may be returned to the process that includes each of the files included therein. The file list may be sortable by any given attribute of the files included therein. Such attributes include, but are not limited to, the file name, the file size, the creation date, the modification date, etc. Malicious process detector 114 may define such attribute(s) of decoy file(s) 116 in a manner that makes decoy file(s) 116 reside at the beginning and/or the end of the file list when traversed by a process (e.g., a malicious process, such as ransomware) that reads director(ies) 110. By doing so, the likelihood that the malicious process accesses decoy file(s) 116 before file(s) 112 is greatly increased, and the risk of having file(s) 112 compromised before detection of the malicious process is greatly reduced.
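By way of illustration only (this sketch is not part of the original disclosure, and the file names in it are hypothetical), the following Python snippet demonstrates the ordering property described above: decoy names chosen to sort before and after every real file occupy both extremes of a name-sorted listing, so a process that walks the listing in either direction reaches a decoy first.

```python
# Illustrative only: decoys crafted to sort to both extremes of a file list.
real_files = ["Family Vacation.jpeg", "Ledger.xlsx", "Maui.jpeg"]
decoys = ["!0_decoy.docx", "zzz_decoy.docx"]  # '!' sorts before letters; 'z' after them

listing = sorted(real_files + decoys)  # ascending traversal of the directory
assert listing[0] in decoys and listing[-1] in decoys  # a decoy is hit first and last

reverse_listing = sorted(real_files + decoys, reverse=True)  # descending traversal
assert reverse_listing[0] in decoys  # still a decoy first
```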
Malicious process detector 114 is configured to monitor operations (e.g., read operations, write operations, etc.) to decoy file(s) 116 and determine the likelihood that such operations are typical of a malicious process. In response to determining that the operations are typical of a malicious process, malicious process detector 114 may perform an action to neutralize the malicious process. Neutralization of the malicious process may include steps to terminate or suspend the malicious process, steps to mitigate the effects of the malicious process, and/or steps to facilitate the termination, suspension and/or mitigation of the malicious process (such as detecting the malicious process). For example, malicious process detector 114 may cause operating system 106 to terminate the malicious process, suspend the malicious process, perform a backup of file(s) 112 stored on computing device 100, check the integrity of file(s) 112, activate an anti-virus program or other security mechanisms, write event logs, prompt the user to indicate what operation to perform, etc.
As further shown in FIG. 2, a malicious process detector 214, which is an example of malicious process detector 114 of FIG. 1, may comprise a decoy documents manager 202, an operation monitor 206, an operation analyzer 208, a pattern learning module 222, and an updateable knowledge base 224. Decoy documents manager 202 may be configured to create one or more decoy file(s) 216 in a directory 210 that also stores one or more other file(s) 212, which a malicious process (e.g., process 220) may attempt to compromise.
Decoy file(s) 216 may possess attributes that cause decoy file(s) 216 to reside at the beginning and/or the end of a file list when directory 210 is sorted thereby and/or traversed by a program (e.g., a malicious process, such as process 220) that reads directory 210. Examples of such attribute(s) include, but are not limited to, the file name, the file size, the creation date, the modification date, file type, authors, etc. For example, before creating decoy file(s) 216, decoy documents manager 202 may initially read directory 210 and determine attributes of file(s) 212. Thereafter, decoy documents manager 202 may specify the attributes for decoy file(s) 216 based on the determined attributes of file(s) 212 such that decoy file(s) 216 reside at the beginning and/or the end of the file list when directory 210 is sorted and/or traversed by a malicious process (e.g., process 220).
For example, decoy documents manager 202 may determine that the first file of file(s) 212, when directory 210 is sorted alphabetically by file name, is “Family Vacation.jpeg.” To ensure that decoy file(s) 216 appear before this file, decoy documents manager 202 may designate the file names of decoy file(s) 216 to start with a letter before ‘F’, a number, or a special character (e.g., !, @, #, $, %, ^, &, etc.). Decoy documents manager 202 may also determine that the last file of file(s) 212, when directory 210 is sorted alphabetically by file name, is “Maui.jpeg.” To ensure that decoy file(s) 216 appear after this file, decoy documents manager 202 may designate the file names of decoy file(s) 216 to start with the letter ‘N’ or some other letter that comes after the letter ‘M’. Decoy documents manager 202 may create decoy file(s) 216 that reside both at the beginning and the end of the file list to ensure that decoy file(s) 216 are accessed regardless of whether a malicious process (e.g., process 220) accesses the files (e.g., file(s) 212 and decoy file(s) 216) in directory 210 by file name in ascending or descending order.
In another example, decoy documents manager 202 may determine that the first file of file(s) 212, when directory 210 is sorted chronologically by creation and/or modification date, has a date of Jan. 29, 2014. To ensure that decoy file(s) 216 appear before this file, decoy documents manager 202 may designate decoy file(s) 216 to have a creation and/or modification date before this date. Decoy documents manager 202 may also determine that the last file of file(s) 212, when directory 210 is sorted chronologically by creation and/or modification date, has a date of Dec. 1, 2017. To ensure that decoy file(s) 216 appear after this file, decoy documents manager 202 may designate decoy file(s) 216 to have a creation and/or modification date after this date. Decoy documents manager 202 may create decoy file(s) 216 that reside both at the beginning and the end of the file list to ensure that decoy file(s) 216 are accessed regardless of whether a malicious process (e.g., process 220) accesses the files (e.g., file(s) 212 and decoy file(s) 216) in directory 210 by creation and/or modification date in ascending or descending order.
In yet another example, decoy documents manager 202 may determine that the first file of file(s) 212, when directory 210 is sorted by file size, has a file size of 110 KB. To ensure that decoy file(s) 216 appear before this file, decoy documents manager 202 may create a decoy file that has a file size of less than 110 KB. Decoy documents manager 202 may also determine that the last file of file(s) 212, when directory 210 is sorted by file size, has a file size of 12 MB. To ensure that decoy file(s) 216 appear after this file, decoy documents manager 202 may create a decoy file that has a file size of more than 12 MB. Decoy documents manager 202 may create decoy file(s) 216 that reside both at the beginning and the end of the file list to ensure that decoy file(s) 216 are accessed regardless of whether a malicious process (e.g., process 220) accesses the files (e.g., file(s) 212 and decoy file(s) 216) in directory 210 by file size in ascending or descending order.
It is noted that the attributes described above are purely exemplary, and that any attribute of decoy file(s) 216 provided by the file system maintaining directory 210 (e.g., file system 108) may be modified, including, but not limited to the content of the decoy file(s) 216, or one or more other properties of decoy file(s) 216, to ensure a desired placement of such decoy file(s) 216 at the beginning or end of a file list used for sorting and/or traversal.
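Continuing the illustration (again not part of the original disclosure; the file names, sizes, and use of Python are assumptions), a decoy-creation sketch under these constraints might look as follows. Creation dates cannot be set portably from Python's standard library, so only modification times are adjusted here; a real implementation on Windows would use platform APIs for creation times.

```python
import os
import tempfile
import time

def plant_decoys(directory: str, oldest_mtime: float, newest_mtime: float) -> list:
    """Create two decoy files whose name, size, and modification date sort to
    the extremes of a directory listing (sketch only)."""
    first = os.path.join(directory, "!0_budget.docx")   # sorts first by name
    with open(first, "wb") as f:
        f.write(b"\0" * 16)                             # smallest file by size
    os.utime(first, (oldest_mtime - 86400,) * 2)        # a day older than the oldest real file

    last = os.path.join(directory, "zzz_archive.docx")  # sorts last by name
    with open(last, "wb") as f:
        f.truncate(20 * 1024 * 1024)                    # larger than the largest real file (assumed 12 MB)
    os.utime(last, (newest_mtime + 86400,) * 2)         # a day newer than the newest real file
    return [first, last]

# Usage against a scratch directory:
with tempfile.TemporaryDirectory() as d:
    now = time.time()
    print(plant_decoys(d, oldest_mtime=now - 30 * 86400, newest_mtime=now))
```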
Decoy documents manager 202 may be further configured to periodically modify attribute(s) of decoy file(s) 216 and/or create additional decoy files to take into account additional file(s) 212 that have been modified and/or added to director(ies) 210 over time. This is also performed to emulate a typical file system and to prevent a malicious process (e.g., process 220) from learning which files stored in director(ies) 210 are decoy file(s) 216 and skipping such files when carrying out encryption operations. Decoy documents manager 202 provides a list of decoy file(s) 216 and their associated attributes to updateable knowledge base 224, which is described below.
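A small illustrative continuation of the same sketch: a background refresher that periodically re-stamps the end-of-list decoy so it stays newer than the real files as they change, while varying the exact timestamp so the decoy does not present a constant, learnable signature. The function name and the one-hour period are assumptions.

```python
import os
import random
import time

def refresh_newest_decoy(decoy_path: str, period: float = 3600.0) -> None:
    """Periodically re-stamp the decoy meant to sort last by date (illustrative
    only; the decoy meant to sort first would be handled symmetrically)."""
    while True:
        stamp = time.time() + 86400.0 + random.uniform(0.0, 3600.0)  # ~a day "newer", jittered
        os.utime(decoy_path, (stamp,) * 2)
        time.sleep(period)
```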
Operation monitor 206 is configured to monitor decoy file(s) 216 for one or more file access operations directed to decoy file(s) 216. Examples of file access operations include, but are not limited to, an open operation, a read operation, a write operation, a copy operation, etc. In certain implementations, file access operations are issued by a process via procedure calls. In accordance with such implementations, operation monitor 206 may use hooking techniques to hook procedure calls directed to decoy file(s) 216. Examples of procedure calls that may be hooked include, but are not limited to, an NtOpenFile procedure call, an NtReadFile procedure call, an NtWriteFile procedure call, an NtCreateFile procedure call, etc., each of which is a procedure call used in a Microsoft Windows®-based operating system. It is noted that the foregoing is just one technique for detecting file access operations, and that other detection techniques may be used, including, but not limited to, using a kernel-mode component such as a file system filter driver (e.g., in a Microsoft Windows®-based environment) to detect file access operations.
In accordance with an embodiment, only decoy file(s) 216 are monitored by operation monitor 206 to reduce the computing overhead of the computing device on which malicious process detector 214 is executing, although the embodiments described herein are not so limited. For example, as described below, file(s) 212 may also be monitored by operation monitor 206.
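The hooking and filter-driver mechanisms above are kernel-level and OS-specific, and cannot be reproduced faithfully in a short sketch. Purely as a simplified user-mode stand-in, the snippet below polls the decoy files and reports when one is modified or deleted; it observes the effects of write operations rather than intercepting the procedure calls themselves.

```python
import hashlib
import os
import time

def _fingerprint(path: str) -> tuple:
    """Cheap change detector for a decoy: (size, mtime, content digest)."""
    st = os.stat(path)
    with open(path, "rb") as f:
        return (st.st_size, st.st_mtime, hashlib.sha256(f.read()).hexdigest())

def poll_decoys(decoys: list, interval: float = 0.5):
    """Yield the path of any decoy that changes or disappears (user-mode
    polling stand-in for kernel hooks or a file system filter driver)."""
    baseline = {p: _fingerprint(p) for p in decoys}
    while baseline:
        for path, fp in list(baseline.items()):
            try:
                current = _fingerprint(path)
            except FileNotFoundError:
                baseline.pop(path)
                yield path                      # decoy deleted or renamed
                continue
            if current != fp:
                baseline[path] = current
                yield path                      # decoy was rewritten
        time.sleep(interval)
```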
Upon detecting file access operation(s) issued to decoy file(s) 216, operation monitor 206 may send a request to operation analyzer 208 that indicates the file access operation(s) issued to decoy file(s) 216. Operation analyzer 208 may determine whether the process that issued the file access operation(s) is a malicious process (e.g., process 220). For example, operation analyzer 208 may access updateable knowledge base 224. Updateable knowledge base 224 may comprise a data store (e.g., a database) that stores one or more decoy file identifiers that each represent a particular decoy file of decoy file(s) 216. The identifier may be the file name of the decoy file, the directory path of the decoy file, a tag representative of the decoy file and/or the like. The identifier may be provided by decoy documents manager 202 upon creation of a decoy file and/or an update to the file name, directory path, tag, etc., of the decoy file.
Updateable knowledge base 224 may further maintain a set of rules (e.g., predetermined rules) that indicate which types of file access operations to decoy file(s) 216 (or patterns thereof) are illegal (i.e., issued from a malicious process) or legal (i.e., issued from a non-malicious process). Operation analyzer 208 may analyze the file access operation(s) to identify a pattern associated with the file access operation(s). Operation analyzer 208 may apply the rule(s) to the identified pattern to determine whether the file access operation(s) originate from a non-malicious process or a malicious process. For example, a rule may specify that a particular file access operation followed by another particular file access operation is considered to be an illegal file access pattern. Thus, if the identified pattern conforms to this rule, operation analyzer 208 may determine that the file access operation(s) detected by operation monitor 206 originated from a malicious process (e.g., process 220) and may provide an indication to operation monitor 206 that indicates that the file access operation(s) originate from a malicious process. If the identified pattern does not conform to this rule (or any other rule that indicates an illegal file access pattern), operation analyzer 208 may determine that the file access operation(s) detected by operation monitor 206 originated from a non-malicious process and may provide an indication to operation monitor 206 that indicates that the file access operation(s) do not originate from a malicious process. The rule(s) maintained in updateable knowledge base 224 may be periodically updated with new patterns (e.g., via a software update).
An example of a rule that specifies an illegal pattern is: a read operation that reads a portion of data from a file, followed by a write operation that rewrites that portion with an encrypted version of that data, with these operations repeating until all the portions of data from the file are encrypted. Another example is: a read operation that reads the whole file, followed by a create operation that creates a new file (having the same file name) that contains an encrypted version of that data, followed by a delete operation that deletes the original file.
Updateable knowledge base 224 may also store predetermined illegal pattern(s), and operation analyzer 208 may compare the file access operation(s) detected by operation monitor 206 to the file access operation(s) included in the stored, predetermined pattern(s) to determine whether the file access operation(s) match any of the pattern(s) stored therein. If operation analyzer 208 finds a match, operation analyzer 208 provides an indication to operation monitor 206 that indicates that the file access operation(s) originate from a malicious process. If operation analyzer 208 does not find a match, operation analyzer 208 provides an indication to operation monitor 206 that indicates that the file access operation(s) do not originate from a malicious process. The patterns stored in updateable knowledge base 224 may be periodically updated with new patterns (e.g., via a software update).
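A toy version of such a rule check is sketched below. The event-tuple format (operation, path, offset, length) is hypothetical, only the two example rules above are encoded, and rule 2 ignores event ordering within the remainder of the trace for brevity.

```python
def is_illegal_pattern(events: list) -> bool:
    """Apply the two example rules to a trace of decoy-file access events,
    where each event is a hypothetical tuple (operation, path, offset, length)."""
    # Rule 1: a read of a file region immediately followed by a write to the
    # same file and region (the encrypt-in-place loop).
    for prev, cur in zip(events, events[1:]):
        if prev[0] == "read" and cur[0] == "write" and prev[1:] == cur[1:]:
            return True
    # Rule 2: a whole-file read, then creation of a same-named replacement,
    # then deletion of the original (the encrypt-and-replace pattern).
    ops = [(e[0], e[1]) for e in events]
    for i, (op, path) in enumerate(ops):
        tail = ops[i + 1:]
        if op == "read" and ("create", path) in tail and ("delete", path) in tail:
            return True
    return False

# A read then an overwrite of the same 4 KB region of a decoy trips rule 1:
trace = [("read", "!0_budget.docx", 0, 4096), ("write", "!0_budget.docx", 0, 4096)]
assert is_illegal_pattern(trace)
```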
In addition to or in lieu of analyzing file access operation(s) using rule(s) and/or predetermined, stored pattern(s), malicious process detector 214 may utilize a machine-learning-based technique to determine whether file access operation(s) originate from a non-malicious process or a malicious process. For example, pattern learning module 222 may train a machine-learning-based algorithm on observed file access patterns to file(s) 212. For instance, pattern learning module 222 may continuously receive information from operation monitor 206 that specifies file access operation(s) directed to file(s) 212 and analyze file access operation(s) that are directed to file(s) 212 over time (e.g., a day, a week, a month, a year, etc.). Generally, file access operation(s) directed to file(s) 212 are initiated by non-malicious processes, which initiate such file access operation(s) based on user-driven input. Thus, the machine-learning-based algorithm may learn what constitutes legal file access pattern(s) based on the historical file access operations to file(s) 212. File access operation(s) (or pattern(s) thereof) to decoy file(s) 216 that deviate from the model (i.e., anomalous operation(s)/pattern(s)) may be designated as being illegal operations (i.e., such file access operation(s) are determined to originate from a malicious process).
As described above, operation analyzer 208 may analyze the file access operation(s) to decoy file(s) 216 to identify a pattern associated with the file access operation(s). Operation analyzer 208 may provide the identified pattern as an input to the machine-learning-based algorithm of pattern learning module 222. The machine-learning-based algorithm may determine a probability that the identified pattern originates from a non-malicious process. The probability may be compared to a threshold. If the probability exceeds the threshold, the file access operation(s) are determined to be legal operations (i.e., such operation(s) are determined to originate from a non-malicious process), and the machine-learning-based algorithm of pattern learning module 222 outputs an indicator that indicates that the operation(s) to decoy file(s) 216 are not issued from a malicious process (e.g., process 220). The indicator is provided to operation monitor 206. If the probability does not exceed the threshold, the file access operation(s) are determined to be illegal operations (i.e., such operation(s) are determined to originate from a malicious process), and the machine-learning-based algorithm of pattern learning module 222 outputs an indicator that indicates that the operation(s) to decoy file(s) 216 are issued from a malicious process (e.g., process 220). Pattern learning module 222 may also update knowledge base 224 with any pattern identified as originating from a malicious process and/or one or more rules specifying the identified pattern(s).
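The disclosure does not prescribe a particular model, so the following stand-in is deliberately simple: it learns operation-bigram frequencies from benign traces on the real files and scores a new trace by the geometric mean of its smoothed bigram probabilities, flagging traces that fall below a threshold. The operation names and the threshold value are assumptions.

```python
from collections import Counter
from math import exp, log

class AccessPatternModel:
    """Toy stand-in for the machine-learning step: a smoothed bigram model of
    benign file access operation sequences."""

    def __init__(self):
        self.bigrams = Counter()
        self.total = 0

    def train(self, trace):
        for pair in zip(trace, trace[1:]):
            self.bigrams[pair] += 1
            self.total += 1

    def legality_probability(self, trace):
        if len(trace) < 2:
            return 1.0
        logp, n = 0.0, 0
        for pair in zip(trace, trace[1:]):
            # Laplace-smoothed bigram probability under the benign model.
            p = (self.bigrams[pair] + 1) / (self.total + len(self.bigrams) + 1)
            logp += log(p)
            n += 1
        return exp(logp / n)  # geometric mean of the bigram probabilities

THRESHOLD = 0.05  # assumed value; in practice tuned on observed benign traffic

model = AccessPatternModel()
model.train(["open", "read", "close"] * 50)  # benign history: reads only
print(model.legality_probability(["open", "read", "close"]) < THRESHOLD)  # False -> legal
print(model.legality_probability(
    ["open", "read", "write", "read", "write", "close"]) < THRESHOLD)     # True -> flag as malicious
```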
In accordance with an embodiment, the contents of updateable knowledge base 224 (e.g., the decoy file identifier(s), the pattern(s), rule(s), and model) may be encrypted, thereby preventing a malicious process from tampering with the contents stored thereby.
Upon receiving an indication that the file access operation(s) issued to decoy file(s) 216 are from a malicious process (e.g., process 220), operation monitor 206 may perform one or more operations to neutralize the malicious process. For example, operation monitor 206 may cause the operating system (e.g., operating system 106) to terminate the malicious process, suspend the malicious process, perform a backup of file(s) 212, check the integrity of file(s) 212, record in an event log an event that indicates that a malicious process performed file access operation(s) to decoy file(s) 216, prompt a user of the computing device (e.g., computing device 100) to indicate an operation to perform, and/or activate an anti-virus program or other security mechanism that is configured to neutralize the malicious process.
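One possible realization of the terminate/suspend and event-log measures, assuming the third-party psutil package is available (the disclosure operates at the operating-system level and does not prescribe this API):

```python
import logging
import psutil  # third-party package; assumed available

def neutralize(pid: int, mode: str = "suspend") -> None:
    """Suspend (default) or terminate the offending process and record the
    event, mirroring two of the protective measures described above."""
    proc = psutil.Process(pid)
    logging.warning("decoy access by pid=%d (%s); action=%s",
                    pid, proc.name(), mode)  # the event-log measure
    if mode == "terminate":
        proc.terminate()                     # ask the OS to end the process
    else:
        proc.suspend()                       # freeze it pending a user decision
```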
Accordingly, malicious process detector 214 may be configured to detect and/or neutralize a malicious process in many ways. For example, FIG. 3 depicts a flowchart 300 of a method for detecting and neutralizing a malicious process in accordance with an example embodiment.
Flowchart 300 begins with step 302. At step 302, one or more decoy files are created in a file directory that stores one or more other files. For example, as shown in FIG. 2, decoy documents manager 202 creates one or more decoy file(s) 216 in directory 210, which also stores one or more other file(s) 212.
In accordance with one or more embodiments, one or more attributes of the one or more decoy files are periodically modified such that a sorting operation performed on files stored in the directory causes the one or more decoy files to be listed before the other one or more files in a list generated by the sorting operation. For example, with reference to FIG. 2, decoy documents manager 202 may periodically modify one or more attributes of decoy file(s) 216.
In accordance with one or more embodiments, the attribute(s) comprise at least one of a file name, a file size, a creation date, or a modification date.
At step 304, a determination is made that one or more file access operations are being performed with respect to at least one of the one or more decoy files. For example, with reference to FIG. 2, operation monitor 206 detects file access operation(s) issued to decoy file(s) 216.
At step 306, the one or more file access operations are analyzed to determine whether the one or more file access operations originate from a malicious process. For example, with reference to FIG. 2, operation analyzer 208 analyzes the file access operation(s) to determine whether the file access operation(s) originate from a malicious process (e.g., process 220).
In accordance with one or more embodiments, a pattern associated with the one or more file access operations that are being performed with respect to the one or more decoy files is identified and one or more rules are applied to the pattern to determine whether the one or more file access operations originate from the malicious process. For example, with reference to FIG. 2, operation analyzer 208 identifies the pattern and applies one or more rules maintained in updateable knowledge base 224 to the pattern.
In accordance with one or more embodiments, the pattern associated with the one or more file access operation(s) comprises a read operation to the decoy file or to a portion thereof and a write operation to the same decoy file or the same portion thereof.
In accordance with one or more embodiments, the file access operation(s) are analyzed in accordance with a machine-learning-based algorithm. Additional details regarding the foregoing technique are described below with reference to FIG. 5, which depicts a flowchart 500 of a method for analyzing file access operation(s) in accordance with a machine-learning-based algorithm in an example embodiment.
At step 308, in response to determining that the one or more file access operations originate from the malicious process, an action is performed to neutralize the malicious process. For example, with reference to FIG. 2, operation monitor 206 performs one or more operations to neutralize the malicious process (e.g., process 220).
In accordance with an embodiment, the action comprises one or more of terminating the malicious process, suspending the malicious process, performing a backup of the one or more other files stored in the file directory, checking an integrity of the one or more other files, activating an anti-virus program, recording in an event log an event that indicates that the malicious process performed the one or more file access operations to the one or more decoy files, or prompting a user of the computing device to indicate an operation to perform. In accordance with such an embodiment, operation monitor 206 may send a command to the operating system (e.g., operating system 106) that causes one or more of these operations to be performed.
Flowchart 500 begins with step 502. At step 502, a pattern associated with the one or more file access operations that are performed with respect to the one or more decoy files is identified. For example, with reference to FIG. 2, operation analyzer 208 identifies a pattern associated with the file access operation(s) that are performed with respect to decoy file(s) 216.
At step 504, the pattern is provided as an input to a machine-learning-based algorithm that outputs an indication of whether the pattern is a legal file access pattern or an illegal file access pattern, the machine-learning-based algorithm being trained on observed file access patterns for the one or more other files. For example, with reference to FIG. 2, operation analyzer 208 provides the pattern as an input to the machine-learning-based algorithm of pattern learning module 222, which outputs the indication.
In accordance with one or more embodiments, the machine-learning based algorithm outputs a probability that the pattern is a legal file access pattern and the probability is compared to a threshold to determine whether the pattern is a legal file access pattern.
The embodiments described herein, including systems, methods/processes, and/or apparatuses, may be implemented using well known processing devices, telephones (land line based telephones, conference phone terminals, smart phones and/or mobile phones), interactive television, servers, and/or computers, such as a computer 700 shown in FIG. 7.
Computer 700 can be any commercially available and well known communication device, processing device, and/or computer capable of performing the functions described herein, such as devices/computers available from International Business Machines®, Apple®, Sun®, HP®, Dell®, Cray®, Samsung®, Nokia®, etc. Computer 700 may be any type of computer, including a desktop computer, a server, etc.
Computer 700 includes one or more processors (also called central processing units, or CPUs), such as a processor 706. Processor 706 is connected to a communication infrastructure 702, such as a communication bus. In some embodiments, processor 706 can simultaneously operate multiple computing threads, and in some embodiments, processor 706 may comprise one or more processors.
Computer 700 also includes a primary or main memory 708, such as random access memory (RAM). Main memory 708 has stored therein control logic 724 (computer software), and data.
Computer 700 also includes one or more secondary storage devices 710. Secondary storage devices 710 include, for example, a hard disk drive 712 and/or a removable storage device or drive 714, as well as other types of storage devices, such as memory cards and memory sticks. For instance, computer 700 may include an industry standard interface, such as a universal serial bus (USB) interface for interfacing with devices such as a memory stick. Removable storage drive 714 represents a floppy disk drive, a magnetic tape drive, a compact disk drive, an optical storage device, tape backup, etc.
Removable storage drive 714 interacts with a removable storage unit 716. Removable storage unit 716 includes a computer useable or readable storage medium 718 having stored therein computer software 726 (control logic) and/or data. Removable storage unit 716 represents a floppy disk, magnetic tape, compact disk, DVD, optical storage disk, or any other computer data storage device. Removable storage drive 714 reads from and/or writes to removable storage unit 716 in a well-known manner.
Computer 700 also includes input/output/display devices 704, such as touchscreens, LED and LCD displays, monitors, keyboards, pointing devices, etc.
Computer 700 further includes a communication or network interface 720. Communication interface 720 enables computer 700 to communicate with remote devices. For example, communication interface 720 allows computer 700 to communicate over communication networks or mediums 722 (representing a form of a computer useable or readable medium), such as LANs, WANs, the Internet, etc. Network interface 720 may interface with remote sites or networks via wired or wireless connections.
Control logic 728 may be transmitted to and from computer 700 via the communication medium 722.
Any apparatus or manufacture comprising a computer useable or readable medium having control logic (software) stored therein is referred to herein as a computer program product or program storage device. This includes, but is not limited to, computer 700, main memory 708, secondary storage devices 710, and removable storage unit 716. Such computer program products, having control logic stored therein that, when executed by one or more data processing devices, cause such data processing devices to operate as described herein, represent embodiments.
Techniques, including methods, and embodiments described herein may be implemented by hardware (digital and/or analog) or a combination of hardware with one or both of software and/or firmware. Techniques described herein may be implemented by one or more components. Embodiments may comprise computer program products comprising logic (e.g., in the form of program code or software as well as firmware) stored on any computer useable medium, which may be integrated in or separate from other components. Such program code, when executed by one or more processor circuits, causes a device to operate as described herein. Devices in which embodiments may be implemented may include storage, such as storage drives, memory devices, and further types of physical hardware computer-readable storage media. Examples of such computer-readable storage media include, a hard disk, a removable magnetic disk, a removable optical disk, flash memory cards, digital video disks, random access memories (RAMs), read only memories (ROM), and other types of physical hardware storage media. In greater detail, examples of such computer-readable storage media include, but are not limited to, a hard disk associated with a hard disk drive, a removable magnetic disk, a removable optical disk (e.g., CDROMs, DVDs, etc.), zip disks, tapes, magnetic storage devices, MEMS (micro-electromechanical systems) storage, nanotechnology-based storage devices, flash memory cards, digital video discs, RAM devices, ROM devices, and further types of physical hardware storage media. Such computer-readable storage media may, for example, store computer program logic, e.g., program modules, comprising computer executable instructions that, when executed by one or more processor circuits, provide and/or maintain one or more aspects of functionality described herein with reference to the figures, as well as any and all components, capabilities, and functions therein and/or further embodiments described herein.
Such computer-readable storage media are distinguished from and non-overlapping with communication media (do not include communication media). Communication media embodies computer-readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wireless media such as acoustic, RF, infrared, and other wireless media, as well as wired media and signals transmitted over wired media. Embodiments are also directed to such communication media.
The techniques and embodiments described herein may be implemented as, or in, various types of devices. For instance, embodiments may be included in mobile devices such as laptop computers, handheld devices such as mobile phones (e.g., cellular and smart phones), handheld computers, and further types of mobile devices, desktop and/or server computers. A device, as defined herein, is a machine or manufacture as defined by 35 U.S.C. § 101. Devices may include digital circuits, analog circuits, or a combination thereof. Devices may include one or more processor circuits, e.g., central processing units (CPUs) such as processor 706 of FIG. 7, configured to execute program code as described herein.
While various embodiments of the present invention have been described above, it should be understood that they have been presented by way of example only, and not limitation. It will be understood by those skilled in the relevant art(s) that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined in the appended claims. Accordingly, the breadth and scope of the present invention should not be limited by any of the above-described exemplary embodiments, but should be defined only in accordance with the following claims and their equivalents.
Guri, Mordechai, Yehoshua, Ronen, Gorelik, Michael