Automatically detecting a malicious file using name mangling strings. In one embodiment, a method may include (a) identifying a file, (b) identifying name mangling strings in the file, (c) concatenating the name mangling strings together, (d) hashing the concatenated name mangling strings to generate a signature for the file, (e) clustering the file with other files with matching signatures into a cluster, (f) determining that any of the files in the cluster is malicious, (g) adding the signature to a list of signatures of files known to be malicious, (h) identifying a network device file stored on a network device, (i) repeating (b)-(d) on the network device file, (j) determining that the signature for the network device file matches any signature in the list of signatures of files known to be malicious, and (k) performing a security action on the malicious file on the network device.
1. A computer-implemented method for automatically detecting a malicious file using name mangling strings, at least a portion of the method being performed by a computing device comprising at least one hardware processor, the method comprising:
(a) identifying a file;
(b) identifying name mangling strings in the file;
(c) concatenating the name mangling strings together;
(d) hashing the concatenated name mangling strings to generate a signature for the file;
(e) clustering the file together with other files having signatures that match the signature of the file with the name mangling strings into a cluster;
(f) determining that one of the files in the cluster is malicious;
(g) adding the signature to a list of signatures of files known to be malicious;
(h) identifying a network device file stored on a network device;
(i) repeating (b)-(d) on the network device file;
(j) determining that a signature for the network device file matches any signature in the list of signatures of files known to be malicious and that the network device file is therefore a malicious file; and
(k) performing a security action on the malicious file on the network device to protect the network device.
8. A computer-implemented method for automatically detecting a malicious file using name mangling strings, at least a portion of the method being performed by a computing device comprising at least one hardware processor, the method comprising:
(a) identifying a portable executable (pe) file;
(b) identifying name mangling strings in a data section of the pe file;
(c) concatenating the name mangling strings together;
(d) hashing the concatenated name mangling strings to generate a signature for the pe file;
(e) clustering the pe file together with other pe files having signatures that match the signature of the pe file with the name mangling strings into a cluster;
(f) determining that any of the pe files in the cluster is malicious;
(g) adding the signature to a list of signatures of pe files known to be malicious;
(h) identifying a network device pe file stored on a network device;
(i) repeating (b)-(d) on the network device pe file;
(j) determining that a signature for the network device pe file matches any signature in the list of signatures of pe files known to be malicious and that the network device pe file is therefore a malicious pe file; and
(k) performing a security action on the malicious pe file on the network device to protect the network device.
14. One or more non-transitory computer-readable storage media comprising one or more computer-readable instructions that, when executed by one or more hardware processors of one or more computing devices, cause the one or more computing devices to perform a method for automatically detecting a malicious file using name mangling strings, the method comprising:
(a) identifying a file;
(b) identifying name mangling strings in the file;
(c) concatenating the name mangling strings together;
(d) hashing the concatenated name mangling strings to generate a signature for the file;
(e) clustering the file together with other files having signatures that match the signature of the file with the name mangling strings into a cluster;
(f) determining that any of the files in the cluster is malicious;
(g) adding the signature to a list of signatures of files known to be malicious;
(h) identifying a network device file stored on a network device;
(i) repeating (b)-(d) on the network device file;
(j) determining that a signature for the network device file matches any signature in the list of signatures of files known to be malicious and that the network device file is therefore a malicious file; and
(k) performing a security action on the malicious file on the network device to protect the network device.
2. The method of
3. The method of
4. The method of
5. The method of
6. The method of
7. The method of
9. The method of
10. The method of
11. The method of
12. The method of
13. The method of
15. The one or more non-transitory computer-readable storage media of
16. The one or more non-transitory computer-readable storage media of
17. The one or more non-transitory computer-readable storage media of
18. The one or more non-transitory computer-readable storage media of
19. The one or more non-transitory computer-readable storage media of
20. The one or more non-transitory computer-readable storage media of
Some network security applications function to detect malicious files stored on network devices before the malicious files can be executed or otherwise employed in damaging the network or network devices. Examples of malicious files include files that contain viruses or malware.
For example, a network security application may attempt to identify a malicious file by comparing a suspicious file that is suspected of being malicious to a list of files known to be malicious. If the suspicious file matches any of the files in the list of files known to be malicious, the network security application may identify the suspicious file as a malicious file. However, purveyors of malicious files have attempted to avoid detection of their malicious files by causing variations of their malicious files to have minor differences which cause the malicious files to not match up exactly with the files in a list of malicious files. The proliferation of these variations of malicious files has rendered many network security applications incapable of accurately detecting many malicious files, which leaves network devices vulnerable to undetected malicious files.
The subject matter claimed herein is not limited to embodiments that solve any disadvantages or that operate only in environments such as those described above. Rather, this background is only provided to illustrate one example technology area where some embodiments described herein may be practiced.
In one embodiment, a computer-implemented method for automatically detecting a malicious file using name mangling strings may be performed, at least in part, by a computing device including at least one processor. The method may include (a) identifying a file, (b) identifying name mangling strings in the file, (c) concatenating the name mangling strings together, (d) hashing the concatenated name mangling strings to generate a signature for the file, (e) clustering the file with other files with matching signatures into a cluster, (f) determining that one of the files in the cluster is malicious, (g) adding the signature to a list of signatures of files known to be malicious, (h) identifying a network device file stored on a network device, (i) repeating (b)-(d) on the network device file, (j) determining that the signature for the network device file matches any signature in the list of signatures of files known to be malicious and that the network device file is therefore a malicious file, and (k) performing a security action on the malicious file on the network device to protect the network device.
In some embodiments, the file and the network device file may be Portable Executable (PE) files. Also, in some embodiments, (b) may include identifying name mangling strings in a Data section of the PE file. Further, in some embodiments, (c) may include concatenating the name mangling strings together with one or more delimiter characters between each of the name mangling strings. Also, in some embodiments, (c) may include concatenating the name mangling strings together in a sequence that matches a sequence that the name mangling strings are listed in a Data section of a PE file. Further, in some embodiments, (d) may include performing a Secure Hash Algorithm-256 (SHA-256) algorithm on the concatenated name mangling strings to generate the signature for the file. Also, in some embodiments, (k) may include quarantining the malicious file on, or removing the malicious file from, the network device to prevent the malicious file from executing on the network device.
Further, in some embodiments, one or more non-transitory computer-readable media may include one or more computer-readable instructions that, when executed by one or more processors, cause the one or more processors to perform a method for automatically detecting a malicious file using name mangling strings.
It is to be understood that both the foregoing summary and the following detailed description are explanatory and are not restrictive of the invention as claimed.
Embodiments will be described and explained with additional specificity and detail through the use of the accompanying drawings in which:
Some embodiments in this disclosure relate to automatically detecting a malicious file using name mangling strings.
Some network security applications function to detect malicious files stored on network devices, such as files that include viruses or malware, before the malicious files can be executed or otherwise employed in damaging the network devices. For example, a network security application may attempt to identify malicious files by comparing a file signature of a suspicious file that is suspected of being malicious to a list of file signatures of files known to be malicious. If the suspicious file's signature matches any of the file signatures in the list of malicious files, the network security application may identify the suspicious file as a malicious file. However, purveyors of malicious files have attempted to avoid detection of their malicious files by causing variations of their malicious files to have minor differences which cause the file signatures of the malicious files to not match up exactly with any of the file signatures in lists of malicious files. The proliferation of these variations of malicious files has rendered many network security applications incapable of detecting many malicious files, which leaves networks and network devices vulnerable to undetected malicious files.
The embodiments disclosed herein may enable automatically detecting a malicious file using name mangling strings. In some embodiments, automatically detecting a malicious file using name mangling strings may include generating a file signature for a file by identifying name mangling strings in the file, concatenating the name mangling strings together, and hashing the concatenated name mangling strings to generate a file signature for the file. These file signatures generated using name mangling strings can be employed in the generation of a list of file signatures of files known to be malicious and in generating a suspicious file's signature. Even when purveyors of malicious files cause variations of their malicious files to have minor differences, a network security application that detects malicious files using file signatures generated using name mangling strings may be capable of accurately clustering variations of a malicious file together as matching malicious files with few or no false positives in the cluster. This may render the network security application capable of more accurately detecting malicious files, which leaves networks and network devices less vulnerable to undetected malicious files.
Turning to the figures,
In some embodiments, the network 102 may be configured to communicatively couple the network devices 104a-104n to one another as well as to the security server 106. In some embodiments, the network 102 may be any wired or wireless network, or combination of multiple networks, configured to send and receive communications between systems and devices. In some embodiments, the network 102 may include a Personal Area Network (PAN), a Local Area Network (LAN), a Metropolitan Area Network (MAN), a Wide Area Network (WAN), a Storage Area Network (SAN), or some combination thereof. In some embodiments, the network 102 may also be coupled to, or may include, portions of a telecommunications network, including telephone lines, for sending data in a variety of different communication protocols, such as a cellular network or a Voice over IP (VoIP) network.
In some embodiments, each of the network devices 104a-104n may be any computer system capable of communicating over the network 102, examples of which are disclosed herein in connection with the computer system 400 of
In some embodiments, the security server 106 may be any computer system capable of communicating over the network 102 and capable of monitoring the network devices 104a-104n, examples of which are disclosed herein in connection with the computer system 400 of
Modifications, additions, or omissions may be made to the system 100 without departing from the scope of the present disclosure. For example, in some embodiments, the system 100 may include additional components similar to the components illustrated in
The name mangling strings in the data 204 of the file 202 may function to encode function and variable names of an executable file into unique names so that linkers can separate common names in the language of the executable file. The name mangling strings in the data 204 of the file 202 may be found in the Data section 206 of the file 202. The name mangling strings may be identified by one or more particular delimiter character(s) at the beginning and the end of each name mangling string. For example, the file 202 may include name mangling strings that include a ‘?’ delimiter character at the beginning of the name mangling string and two ‘@’ characters (‘@@’) at the end of the name mangling string, such as the name mangling strings 208, 210, and 212.
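As a rough illustration of this delimiter-based identification, the scan below extracts strings that begin with a ‘?’ character and end with ‘@@’ from a file's raw bytes. The exact pattern is an assumption for illustration only, not the pattern used by any particular embodiment:

```python
import re

# Illustrative assumption: a name mangling string starts with '?' and
# runs, without whitespace or another '?', up to the first '@@'.
MANGLED_NAME = re.compile(r"\?[^\s?]*?@@")

def find_mangled_strings(data: bytes) -> list[str]:
    """Return name mangling strings found in a raw Data section,
    in the order they appear in the section."""
    text = data.decode("latin-1", errors="replace")
    return MANGLED_NAME.findall(text)
```

Scanning the bytes in section order matters because, as discussed below, the signature may depend on the sequence in which the strings are listed.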
The name mangling strings of the file 202 may be employed in the generation of a file signature for the file 202. This file signature may then be compared to a list of file signatures that were similarly generated for files known to be malicious files. If there is a match, the file 202 may be determined to be a malicious file, and the file may be added to a list of malicious files, such as the list of malicious files 114 of
Modifications, additions, or omissions may be made to the example file 202 illustrated in the user interface 200 of
The method 300 may include, at block 302, identifying a file. In some embodiments, the file identified at block 302 (and the network device file identified at block 324, as discussed below) may be a Portable Executable (PE) file. For example, the security module 108 may identify, at block 302, the file 202, which may be a PE file and which may be the file 109a stored on the security server 106. In this example, the files 109a-109n may be files that are targeted to have file signatures generated and that are targeted to be added to the file clusters 110.
The method 300 may include, at block 304, identifying name mangling strings in the file. In some embodiments, the name mangling strings identified at block 304 may include all name mangling strings in the file or a subset of all the name mangling strings in the file, such as all name mangling strings in a section of the file. For example, the security module 108 may identify, at block 304, name mangling strings 208, 210, and 212 in the Data section 206 of the file 202. In this example, the identified name mangling strings may also include all other name mangling strings in the Data section 206 of the file 202.
The method 300 may include, at block 306, concatenating the name mangling strings together. In some embodiments, the concatenating at block 306 may include concatenating the name mangling strings together with one or more delimiter characters between each of the name mangling strings. In some embodiments, the concatenating at block 306 may include concatenating the name mangling strings together in a sequence that matches a sequence that the name mangling strings are listed in a Data section of a PE file. For example, the security module 108 may concatenate, at block 306, the name mangling strings 208, 210, and 212 together, as well as all other name mangling strings in the Data section 206 of the PE file 202, with one or more delimiter characters between each of the name mangling strings, and in a sequence that matches a sequence that the name mangling strings are listed in the Data section 206 of the PE file 202.
The method 300 may include, at block 308, hashing the concatenated name mangling strings to generate a signature for the file. In some embodiments, the hashing at block 308 may include performing a Secure Hash Algorithm-256 (SHA-256) algorithm on the concatenated name mangling strings to generate the signature for the file. For example, the security module 108 may perform, at block 308, a SHA-256 algorithm on the concatenated name mangling strings to generate a signature for the file 202.
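Blocks 306 and 308 together can be sketched as a single function that joins the strings, in the order they appear in the Data section, and hashes the result. The ‘|’ delimiter and the hex-digest form of the signature are illustrative assumptions:

```python
import hashlib

def mangling_signature(strings: list[str], delimiter: str = "|") -> str:
    """Concatenate name mangling strings with a delimiter between each
    (block 306), then hash the concatenation with SHA-256 to generate
    the file signature (block 308)."""
    concatenated = delimiter.join(strings)
    return hashlib.sha256(concatenated.encode("utf-8")).hexdigest()
```

Because the hash covers the strings in sequence, two files produce matching signatures only when they contain the same name mangling strings in the same order.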
The method 300 may include, at block 310, clustering the file with other files with matching signatures into a cluster. For example, the security module 108 may cluster, at block 310, the file 202 with other files with matching signatures into a cluster 112a in the file clusters 110 of the security server 106.
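The clustering at block 310 amounts to grouping files whose signatures match. A minimal sketch, assuming files are tracked by name and signature:

```python
from collections import defaultdict

def cluster_by_signature(files: dict[str, str]) -> dict[str, list[str]]:
    """Group file names by signature: files with matching signatures
    land in the same cluster (block 310)."""
    clusters: dict[str, list[str]] = defaultdict(list)
    for name, signature in files.items():
        clusters[signature].append(name)
    return dict(clusters)
```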
The method 300 may include, at decision block 312, determining whether there are more files targeted for having a file signature generated and targeted for being added to a cluster. If so (Yes at decision block 312), the method 300 may return to block 302 to process the next file. If not (No at decision block 312), the method 300 may continue to block 314. For example, the security module 108 may determine, at decision block 312, that the files 109b-109n stored on the security server 106 are still waiting to have a file signature generated and waiting to be added to one of the file clusters 110 (Yes at decision block 312). Therefore, the method 300 may return to block 302 to process the next file 109b. Alternatively, the security module 108 may determine, at decision block 312, that all of the files 109a-109n have already had file signatures generated and have been added to one of the file clusters 110 (No at decision block 312). Therefore, the method 300 may continue to block 314.
The method 300 may include, at block 314, identifying a cluster. For example, the security module 108 may identify, at block 314, the cluster 112a in the file clusters 110.
The method 300 may include, at decision block 316, determining whether one of the files in the cluster is malicious. If so (Yes at decision block 316), the method 300 may include, at block 318, adding the signature to a list of signatures of files known to be malicious. If not (No at decision block 316), the method 300 may include, at block 320, adding the signature to a list of signatures of files known to be clean. For example, the security module 108 may determine, at decision block 316, that the cluster 112a includes a file that is malicious and may, at block 318, add the file signature of the file (which is the same as all the other file signatures in the cluster 112a) to the list of signatures in the list of malicious files 114. Alternatively, the security module 108 may determine, at decision block 316, that the cluster 112a includes a file that is not malicious and may, at block 320, add the file signature of the file (which is the same as all the other file signatures in the cluster 112a) to the list of signatures in the list of clean files 116.
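Decision block 316 and blocks 318 and 320 can be sketched as follows, assuming a set of file names already known to be malicious. Because every file in a cluster shares one signature, a single known-malicious member is enough to place that shared signature on the malicious list:

```python
def categorize_clusters(clusters: dict[str, list[str]],
                        known_malicious: set[str]) -> tuple[set[str], set[str]]:
    """For each cluster, add its shared signature to the malicious list
    (block 318) if any member file is known malicious, otherwise to
    the clean list (block 320)."""
    malicious_signatures: set[str] = set()
    clean_signatures: set[str] = set()
    for signature, files in clusters.items():
        if any(f in known_malicious for f in files):
            malicious_signatures.add(signature)
        else:
            clean_signatures.add(signature)
    return malicious_signatures, clean_signatures
```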
The method 300 may include, at decision block 322, determining whether there are more clusters targeted for being categorized as either malicious or clean. If so (Yes at decision block 322), the method 300 may return to block 314 to categorize the next cluster. If not (No at decision block 322), the method 300 may continue to block 324. For example, the security module 108 may determine, at decision block 322, that the clusters 112b-112n in the file clusters 110 are still waiting to be categorized as either malicious or clean (Yes at decision block 322). Therefore, the method 300 may return to block 314 to categorize the next cluster 112b. Alternatively, the security module 108 may determine, at decision block 322, that all of the clusters 112a-112n in the file clusters 110 have already been categorized as either malicious or clean (No at decision block 322). Therefore, the method 300 may continue to block 324.
The method 300 may include, at block 324, identifying a network device file stored on a network device. For example, the security module 108 may identify, at block 324, the file 103a stored on the network device 104a.
The method 300 may include, at block 326, repeating blocks 304-308 on the network device file. For example, the security module 108 may repeat, at block 326, blocks 304-308 on the file 103a in order to identify, at block 304, name mangling strings in the file 103a, concatenate, at block 306, the name mangling strings together, and hash, at block 308, the concatenated name mangling strings to generate a signature for the file 103a.
The method 300 may include, at decision block 328, determining whether the signature matches any signature in the list of signatures of files known to be malicious. If so (Yes at decision block 328), the method 300 may include, at block 330, identifying the network device file as a malicious file and, at block 332, performing a security action on the malicious file on the network device to protect the network device. If not (No at decision block 328), the method 300 may proceed to decision block 334. In some embodiments, the performing of the security action at block 332 may include quarantining the malicious file on, or removing the malicious file from, the network device to prevent the malicious file from executing on the network device. For example, the security module 108 may determine, at decision block 328, that the signature of the file 103a matches any signature in the list of malicious files 114 and may therefore identify, at block 330, the file 103a stored on the network device 104a as a malicious file and may also perform, at block 332, a security action on the malicious file 103a on the network device 104a to protect the network device 104a from the malicious file 103a, such as quarantining the malicious file 103a on, or removing the malicious file 103a from, the network device 104a to prevent the malicious file 103a from executing on the network device 104a. Alternatively, the security module 108 may determine, at decision block 328, that the signature of the file 103a does not match any signature in the list of malicious files 114. Therefore, the method 300 may continue to decision block 334.
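One hypothetical form of the security action at block 332 is to move the malicious file into a quarantine directory so it can no longer be executed in place. The quarantine location and the move-based approach are illustrative assumptions, not the only security action contemplated:

```python
import shutil
from pathlib import Path

def quarantine_file(path: str, quarantine_dir: str) -> Path:
    """Hypothetical security action (block 332): move the malicious
    file into a quarantine directory to prevent it from executing
    on the network device."""
    source = Path(path)
    destination = Path(quarantine_dir) / source.name
    destination.parent.mkdir(parents=True, exist_ok=True)
    shutil.move(str(source), str(destination))
    return destination
```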
The method 300 may include, at decision block 334, determining whether the signature matches any signature in the list of signatures of files known to be clean. If so (Yes at decision block 334), the method 300 may include, at block 336, identifying the network device file as a clean file. If not (No at decision block 334), the method 300 may include, at block 338, identifying the network device file as an unknown file. For example, the security module 108 may determine, at decision block 334, that the signature of the file 103a matches any signature in the list of clean files 116 and may therefore identify, at block 336, the file 103a stored on the network device 104a as a clean file. Alternatively, the security module 108 may determine, at decision block 334, that the signature of the file 103a does not match any signature in the list of clean files 116 and may therefore identify, at block 338, the file 103a as an unknown file.
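Decision blocks 328 and 334 together reduce to a two-list lookup, checked malicious-first, with everything else falling through to unknown. A minimal sketch:

```python
def classify_file(signature: str,
                  malicious_signatures: set[str],
                  clean_signatures: set[str]) -> str:
    """Check the network device file's signature against the malicious
    list (decision block 328), then the clean list (decision block 334);
    any remaining file is identified as unknown (block 338)."""
    if signature in malicious_signatures:
        return "malicious"
    if signature in clean_signatures:
        return "clean"
    return "unknown"
```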
In some embodiments, even when purveyors of malicious files cause variations of their malicious files to have minor differences, the method 300 may be capable of more accurately clustering variations of a malicious file together as matching malicious files with few or no false positives in the cluster. This renders the method 300 capable of more accurately detecting malicious files, which leaves networks and network devices less vulnerable to undetected malicious files.
Although the blocks of the method 300 are illustrated in
Further, it is understood that the method 300 may improve the functioning of a network environment. For example, the functioning of the security server 106 or any of the network devices 104a-104n of
Also, the method 300 may improve the technical field of detecting malicious files and securing network devices against malicious files. Employing name mangling strings in the generation of more accurate file signatures, and then employing the more accurate file signatures for detecting malicious files, is an improvement over conventional attempts at detecting malicious files using conventional file signatures.
The computer system 400 may include a processor 402, a memory 404, a file system 406, a communication unit 408, an operating system 410, a user interface 412, and a security module 414, which all may be communicatively coupled. In some embodiments, the computer system may be, for example, a desktop computer, a client computer, a server computer, a mobile phone, a laptop computer, a smartphone, a smartwatch, a tablet computer, a portable music player, or any other computer system.
Generally, the processor 402 may include any suitable special-purpose or general-purpose computer, computing entity, or processing device including various computer hardware or software modules and may be configured to execute instructions stored on any applicable computer-readable storage media. For example, the processor 402 may include a microprocessor, a microcontroller, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a Field-Programmable Gate Array (FPGA), or any other digital or analog circuitry configured to interpret and/or to execute program instructions and/or to process data, or any combination thereof. In some embodiments, the processor 402 may interpret and/or execute program instructions and/or process data stored in the memory 404 and/or the file system 406. In some embodiments, the processor 402 may fetch program instructions from the file system 406 and load the program instructions into the memory 404. After the program instructions are loaded into the memory 404, the processor 402 may execute the program instructions. In some embodiments, the instructions may include the processor 402 performing one or more blocks of the method 300 of
The memory 404 and the file system 406 may include computer-readable storage media for carrying or having stored thereon computer-executable instructions or data structures. Such computer-readable storage media may be any available non-transitory media that may be accessed by a general-purpose or special-purpose computer, such as the processor 402. By way of example, and not limitation, such computer-readable storage media may include non-transitory computer-readable storage media including Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Compact Disc Read-Only Memory (CD-ROM) or other optical disk storage, magnetic disk storage or other magnetic storage devices, flash memory devices (e.g., solid state memory devices), or any other storage media which may be used to carry or store desired program code in the form of computer-executable instructions or data structures and which may be accessed by a general-purpose or special-purpose computer. Combinations of the above may also be included within the scope of computer-readable storage media. Computer-executable instructions may include, for example, instructions and data configured to cause the processor 402 to perform a certain operation or group of operations, such as one or more blocks of the method 300 of
The communication unit 408 may include any component, device, system, or combination thereof configured to transmit or receive information over a network, such as the network 102 of
The operating system 410 may be configured to manage hardware and software resources of the computer system 400 and configured to provide common services for the computer system 400.
The user interface 412 may include any device configured to allow a user to interface with the computer system 400. For example, the user interface 412 may include a display, such as an LCD, LED, or other display, that is configured to present video, text, application user interfaces, and other data as directed by the processor 402. The user interface 412 may further include a mouse, a track pad, a keyboard, a touchscreen, volume controls, other buttons, a speaker, a microphone, a camera, any peripheral device, or other input or output device. The user interface 412 may receive input from a user and provide the input to the processor 402. Similarly, the user interface 412 may present output to a user.
The security module 414 may be one or more computer-readable instructions stored on one or more non-transitory computer-readable media, such as the memory 404 or the file system 406, that, when executed by the processor 402, is configured to perform one or more blocks of the method 300 of
Modifications, additions, or omissions may be made to the computer system 400 without departing from the scope of the present disclosure. For example, although each is illustrated as a single component in
As indicated above, the embodiments described herein may include the use of a special purpose or general purpose computer (e.g., the processor 402 of
In some embodiments, the different components and modules described herein may be implemented as objects or processes that execute on a computing system (e.g., as separate threads). While some of the methods described herein are generally described as being implemented in software (stored on and/or executed by general purpose hardware), specific hardware implementations or a combination of software and specific hardware implementations are also possible and contemplated.
In accordance with common practice, the various features illustrated in the drawings may not be drawn to scale. The illustrations presented in the present disclosure are not meant to be actual views of any particular apparatus (e.g., device, system, etc.) or method, but are merely example representations that are employed to describe various embodiments of the disclosure. Accordingly, the dimensions of the various features may be arbitrarily expanded or reduced for clarity. In addition, some of the drawings may be simplified for clarity. Thus, the drawings may not depict all of the components of a given apparatus (e.g., device) or all operations of a particular method.
Terms used herein and especially in the appended claims (e.g., bodies of the appended claims) are generally intended as “open” terms (e.g., the term “including” should be interpreted as “including, but not limited to,” the term “having” should be interpreted as “having at least,” the term “includes” should be interpreted as “includes, but is not limited to,” etc.).
Additionally, if a specific number of an introduced claim recitation is intended, such an intent will be explicitly recited in the claim, and in the absence of such recitation no such intent is present. For example, as an aid to understanding, the following appended claims may contain usage of the introductory phrases “at least one” and “one or more” to introduce claim recitations. However, the use of such phrases should not be construed to imply that the introduction of a claim recitation by the indefinite articles “a” or “an” limits any particular claim containing such introduced claim recitation to embodiments containing only one such recitation, even when the same claim includes the introductory phrases “one or more” or “at least one” and indefinite articles such as “a” or “an” (e.g., “a” and/or “an” should be interpreted to mean “at least one” or “one or more”); the same holds true for the use of definite articles used to introduce claim recitations.
In addition, even if a specific number of an introduced claim recitation is explicitly recited, it is understood that such recitation should be interpreted to mean at least the recited number (e.g., the bare recitation of “two recitations,” without other modifiers, means at least two recitations, or two or more recitations). Furthermore, in those instances where a convention analogous to “at least one of A, B, and C, etc.” or “one or more of A, B, and C, etc.” is used, in general such a construction is intended to include A alone, B alone, C alone, A and B together, A and C together, B and C together, or A, B, and C together, etc. For example, the use of the term “and/or” is intended to be construed in this manner.
Further, any disjunctive word or phrase presenting two or more alternative terms, whether in the summary, detailed description, claims, or drawings, should be understood to contemplate the possibilities of including one of the terms, either of the terms, or both terms. For example, the phrase “A or B” should be understood to include the possibilities of “A” or “B” or “A and B.”
Additionally, the terms “first,” “second,” “third,” etc., are not necessarily used herein to connote a specific order or number of elements. Generally, the terms “first,” “second,” “third,” etc., are used to distinguish between different elements as generic identifiers. Absent a showing that the terms “first,” “second,” “third,” etc., connote a specific order, these terms should not be understood to connote a specific order. Furthermore, absent a showing that the terms “first,” “second,” “third,” etc., connote a specific number of elements, these terms should not be understood to connote a specific number of elements. For example, a first widget may be described as having a first side and a second widget may be described as having a second side. The use of the term “second side” with respect to the second widget may be to distinguish such side of the second widget from the “first side” of the first widget and not to connote that the second widget has two sides.
The foregoing description, for purpose of explanation, has been described with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the invention as claimed to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described to explain practical applications, to thereby enable others skilled in the art to utilize the invention as claimed and various embodiments with various modifications as may be suited to the particular use contemplated.
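As an informal illustration of claim steps (b)-(d), the sketch below extracts name mangling strings from a file's bytes, concatenates them, and hashes the result into a signature. It is a minimal example, not the patented implementation: the regular expression, the assumption of Itanium C++ ABI-style mangled names (e.g., symbols beginning with “_Z”), and the choice of SHA-256 are all illustrative assumptions.

```python
import hashlib
import re

# Illustrative pattern for Itanium C++ ABI-style mangled names ("_Z...");
# a real detector might also handle MSVC-style names ("?name@@...").
MANGLED_NAME = re.compile(rb"_Z[A-Za-z0-9_]+")

def file_signature(data: bytes) -> str:
    """Return a hex digest over the concatenated mangled names in `data`."""
    # (b) identify name mangling strings in the file
    names = MANGLED_NAME.findall(data)
    # (c) concatenate the name mangling strings together
    blob = b"".join(names)
    # (d) hash the concatenated strings to generate a signature for the file
    return hashlib.sha256(blob).hexdigest()

# Hypothetical file contents containing two mangled C++ symbols.
sample = b"\x7fELF..._ZN7Malware3runEv..._ZN7Malware4initEv..."
print(file_signature(sample))
```

Files that yield identical signatures would then be clustered together (step (e)); once any file in a cluster is determined to be malicious, its signature can be added to a blocklist and compared against signatures computed for files on network devices.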
Inventors: Ghosh, Swapan Kumar; M, Yuvaraj; Govindarajan, Srinivasan
Patent | Priority | Assignee | Title |
10320818, | Feb 14 2017 | CA, INC | Systems and methods for detecting malicious computing events |
10824723, | Sep 26 2018 | JPMORGAN CHASE BANK, N A , AS ADMINISTRATIVE AGENT | Identification of malware |
11372982, | Jul 02 2020 | Bank of America Corporation | Centralized network environment for processing validated executable data based on authorized hash outputs |
11423160, | Apr 16 2020 | Bank of America Corporation | System for analysis and authorization for use of executable environment data in a computing system using hash outputs |
11481484, | Apr 16 2020 | Bank of America Corporation | Virtual environment system for secure execution of program code using cryptographic hashes |
Patent | Priority | Assignee | Title |
8015491, | Feb 28 2006 | Verizon Patent and Licensing Inc | Systems and methods for a single development tool of unified online and offline content providing a similar viewing experience |
9152641, | Dec 15 2011 | SanDisk Technologies LLC | Method and system for providing storage device file location information |
20090158432, |
Executed on | Assignor | Assignee | Conveyance | Reel/Frame
Jun 20 2017 | GOVINDARAJAN, SRINIVASAN | Symantec Corporation | ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS) | 042773/0650
Jun 20 2017 | M, YUVARAJ | Symantec Corporation | ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS) | 042773/0650
Jun 20 2017 | GHOSH, SWAPAN KUMAR | Symantec Corporation | ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS) | 042773/0650
Jun 21 2017 | Symantec Corporation | (assignment on the face of the patent)
Nov 04 2019 | Symantec Corporation | CA, INC | ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS) | 051144/0918
Date | Maintenance Fee Events |
Sep 23 2022 | M1551: Payment of Maintenance Fee, 4th Year, Large Entity. |
Date | Maintenance Schedule |
Mar 26 2022 | 4 years fee payment window open |
Sep 26 2022 | 6 months grace period start (w surcharge) |
Mar 26 2023 | patent expiry (for year 4) |
Mar 26 2025 | 2 years to revive unintentionally abandoned end. (for year 4) |
Mar 26 2026 | 8 years fee payment window open |
Sep 26 2026 | 6 months grace period start (w surcharge) |
Mar 26 2027 | patent expiry (for year 8) |
Mar 26 2029 | 2 years to revive unintentionally abandoned end. (for year 8) |
Mar 26 2030 | 12 years fee payment window open |
Sep 26 2030 | 6 months grace period start (w surcharge) |
Mar 26 2031 | patent expiry (for year 12) |
Mar 26 2033 | 2 years to revive unintentionally abandoned end. (for year 12) |