Context-aware training systems, apparatuses and methods. The context-aware training systems, apparatuses and methods are computer-implemented and include sensing a user action; based on a training needs model, estimating a cost or benefit of exposing the user to a training action; selecting a training action from a collection of available training actions; and delivering the training action to the user if the user action indicates a need for the user to be trained and the estimated cost or benefit indicates that user exposure to the training action is warranted.
1. A computer-implemented method for training a user, comprising:
by a sensor, sensing at least one action performed by the user and data corresponding to the at least one action to yield sensed data, wherein:
the sensor comprises a USB sensor device and the sensing comprises detecting that a user of an electronic device has connected a USB device to the electronic device,
the sensor comprises a Wi-Fi sensor device and the sensing comprises detecting that the user has connected the electronic device to a Wi-Fi access point,
the sensor comprises a Bluetooth headset usage detection sensor and the sensing comprises identifying Bluetooth headset use,
the sensor comprises a camera coupled with a computer vision program, or
the sensor comprises a physical location tracking sensor and the sensing comprises detecting a location of the user;
by a computer system comprising a processor, executing computer-readable programming instructions that cause the system to:
access a data storage device that stores a training needs model that comprises an estimate of at least one of a cost and a benefit of exposing the user to at least one available training action,
apply, to the training needs model, the sensed data to identify a cybersecurity threat scenario for which the user is at risk,
access a collection of available training actions and select, based on the sensed data and the cybersecurity threat scenario, at least one cybersecurity training action from the collection of available training actions, and
cause the selected at least one cybersecurity training action to be delivered to the user.
49. A computer-implemented training system, comprising:
a sensor configured to monitor at least one action performed by the user, wherein:
the sensor comprises a USB sensor device and the action comprises that a user of an electronic device has connected a USB device to the electronic device,
the sensor comprises a Wi-Fi sensor device and the action comprises that the user has connected the electronic device to a Wi-Fi access point,
the sensor comprises a Bluetooth headset usage detection sensor and the sensing comprises identifying Bluetooth headset use,
the sensor comprises a camera coupled with a computer vision program, or
the sensor comprises a physical location tracking sensor and the sensing comprises detecting a location of the user;
the electronic device; and
a computer system that includes at least one processor, the computer system containing instructions which, when executed by the at least one processor, cause the computer system to:
receive data from the sensor, the data pertaining to the performance by the user of at least one of the actions;
apply the data to a training needs model to identify a cybersecurity threat scenario to which the user is at risk and estimate at least one of a cost and a benefit of exposing the user to at least one cybersecurity training action;
access a collection of training actions and select one or more cybersecurity training actions from the collection that are relevant to the cybersecurity threat scenario and the sensed data; and
provide the selected one or more cybersecurity training actions to the user through the electronic device.
2. The method of
3. The method of
4. The method of
5. The method of
6. The method of
7. The method of
8. The method of
9. The method of
10. The method of
11. The method of
12. The method of
13. The method of
14. The method of
15. The method of
16. The method of
17. The method of
18. The method of
19. The method of
20. The method of
21. The method of
22. The method of
23. The method of
26. The method of
27. The method of
28. The method of
31. The method of
34. The method of
35. The method of
selecting the at least one cybersecurity training action comprises selecting a plurality of cybersecurity training actions; and
the programming instructions also cause the system to prioritize the selected plurality of cybersecurity training actions.
36. The method of
37. The method of
38. The method of
39. The method of
40. The method of
41. The method of
42. The method of
43. The method of
44. The method of
applying the data for the plurality of different activities to the training needs model to identify a pattern indicative of susceptibility to a threat; and
estimating a risk associated with the threat.
45. The method of
46. The method of
47. The method of
48. The method of
accessing historical training data for the user, wherein the historical training data comprises training modules to which the user has already been exposed, and data corresponding to the user's response to the training modules to which the user has already been exposed; and
using the historical training data for the user to select the at least one cybersecurity training action.
50. The system of
51. The system of
52. The system of
53. The system of
54. The system of
55. The system of
56. The system of
57. The system of
58. The system of
59. The system of
60. The system of
61. The system of
62. The system of
63. The system of
64. The system of
65. The system of
66. The system of
the sensor comprises a USB data storage device sensor; and
the sensed action comprises detection that a USB drive has been connected to a user computing device.
67. The system of
access historical training data for the user, wherein the historical training data comprises training modules to which the user has already been exposed, and data corresponding to the user's response to the training modules to which the user has already been exposed; and
use the historical training data for the user to select the at least one cybersecurity training action.
68. The system of
the sensor is a component of a user computing device; and
the computer system comprises a remote analysis host computer.
69. The system of
the sensor comprises a Wi-Fi roaming sensor; and
the received data pertaining to the performance by the user of an action comprises information that the user has connected the electronic device to a Wi-Fi network.
70. The system of
the sensor comprises a locator sensor; and
the received data pertaining to the performance by the user of an action comprises a global positioning system (GPS) location of the electronic device of the user.
This application claims the benefit under Title 35, United States Code §119(e), of U.S. Provisional Patent Application Ser. No. 61/473,384, filed Apr. 8, 2011 and entitled Behavior Sensitive Training System, which is hereby incorporated by reference in its entirety, and U.S. Provisional Patent Application Ser. No. 61/473,366, filed Apr. 8, 2011 and entitled System and Method for Teaching the Recognition of Fraudulent Messages by Identifying Traps Within the Message, which is hereby incorporated by reference in its entirety.
This invention pertains generally to context-aware training and, more particularly, to training systems, apparatuses, and methods that select and provide training to a user based on actions of the user.
Computer-based training systems and other forms of electronically supported learning and teaching (generically referred to as e-Learning systems) have traditionally relied on one-size-fits-all training material, where the same collection of modules has to be taken by everyone. These modules may come in many different forms, including videos, flash-based presentations, simulations, training games and more. Independently of their format, they traditionally follow a fixed curriculum, where a predefined sequence of modules is prescribed for groups of individuals. Intelligent tutoring systems have introduced more sophisticated forms of computer-based training, in which models of what the learner knows are developed and refined, and the learning content presented to the learner is dynamically adapted as these models evolve. When well designed, these systems have been shown to result in better outcomes than more traditional training modules.
Accordingly, it may be desirable to have a computer-based training system that leverages sensed activity or behavior information in combination with user needs models that map those activities or behaviors onto quantitative or qualitative metrics indicating how critical it is for users engaging in those particular activities and behaviors to be knowledgeable of and proficient in different topics or training areas. Thus, embodiments of the present invention include computer-implemented systems and methods that selectively prioritize the areas in which the learner needs to be trained and selectively identify conditions under which delivery of the training is likely to be most effective. That level of customization is thought to be particularly valuable in domains where training content is vast or opportunities for training are limited (e.g., limited time), and where the training required by individual users varies based on their activities and behaviors. Identifying training needs based on static information (e.g., based solely on the department an employee works for, or his or her level of education) is thought to be insufficient in these domains. Sensing activities, behaviors, or other contextual attributes can help better target training and mitigate consequences associated with undesirable behaviors.
In an embodiment, the present invention includes a computer-implemented method for training a user. That method includes: sensing, using a computer system that includes at least one processor, at least one action performed by the user; selecting, using the computer system, at least one training action from a collection of available training actions, using a training needs model that estimates at least one of a cost and a benefit of exposing the user to at least one available training action or at least one combination of available training actions, based on the sensed at least one user action, if the sensed at least one user action indicates a need for the user to be trained and at least one relevant training action from the collection of available training actions is identified; and delivering, using the computer system, the selected at least one training action to the user.
In another embodiment, the present invention includes a computer-implemented training system. In that embodiment, the training system includes a sensor monitoring at least one action performed by the user, an output device proximate to the user, and a computer system that includes at least one processor. The computer system is coupled to the sensor and the output device, and the computer system contains instructions which, when executed by the at least one processor, cause the computer system to: receive data from the sensor, the data pertaining to the performance by the user of an action; analyze the data using a training needs model that estimates at least one of a cost and a benefit of exposing the user to at least one training action, based on the received data, if the data indicates a need for the user to be trained; select one or more training actions from a collection of training actions for use by the user; and provide the selected one or more training actions to the user through the output device.
Other embodiments, which may include one or more parts of the aforementioned system or method, are also contemplated, and may thus have a broader or different scope than the aforementioned system or method. Thus, the embodiments in this Summary of the Invention are mere examples, and are not intended to limit or define the scope of the invention or claims.
Accordingly, the present invention provides solutions to the shortcomings of prior training systems and methods. Those of ordinary skill in the art will readily appreciate, therefore, that those details described above and other details, features, and advantages of the present invention will become further apparent in the following detailed description of the preferred embodiments of the invention.
The accompanying drawings, which are incorporated herein and constitute part of this specification, and wherein like reference numerals are used to designate like components, include one or more embodiments of the invention and, together with a general description given above and a detailed description given below, serve to disclose principles of embodiments of behavior sensitive training.
In the following description, the present invention is set forth in the context of various alternative embodiments and implementations involving context-aware training systems, apparatuses, and methods. It will be appreciated that these embodiments and implementations are illustrative and that various aspects of the invention may have applicability beyond the specifically described contexts. Furthermore, it is to be understood that these embodiments and implementations are not limited to the particular compositions, methodologies, or protocols described, as these may vary. The terminology used in the following description is for the purpose of illustrating the particular versions or embodiments only, and is not intended to limit the scope of the present disclosure, which will be limited only by the appended claims.
Throughout the specification, reference to “one embodiment,” “an embodiment,” or “some embodiments” means that a particular described feature, structure, or characteristic is included in at least one embodiment. Thus, appearances of the phrases “in one embodiment,” “in an embodiment,” or “in some embodiments” in various places throughout this specification are not necessarily all referring to the same embodiment. Those skilled in the art will recognize that the various embodiments can be practiced without one or more of the specific details or with other methods, components, materials, etc. In other instances, well-known structures, materials, or operations are not shown or not described in detail to avoid obscuring aspects of the embodiments. References to “or” are furthermore intended as inclusive, so “or” may indicate one of the listed alternatives or more than one of the listed alternatives.
Various embodiments of context-aware training are directed to apparatuses, systems, and methods for performing context-aware training. It will be appreciated by those skilled in the art that a computer system may be assembled from any combination of devices with embedded processing capability, for example, a computer, smart phone, tablet or other device, including mobile or pervasive computing devices or appliances, electromechanical devices, and the like. The computer system can be configured to identify training interventions (or “training actions”) relevant to individual users and push those training interventions to users, both proactively (in anticipation of future needs) and reactively (in response to a need as it arises).
Numerous specific details are set forth in the specification and illustrated in the accompanying drawings to provide an understanding of the overall structure, function, manufacture, and use of embodiments of context-aware training. It will be understood by those skilled in the art, however, that the invention may be practiced without the specific details provided in the described embodiments. In other instances, well-known operations, components, and elements have not been described in detail so as not to obscure the embodiments described in the specification. Those of ordinary skill in the art will understand that the embodiments described and illustrated herein are non-limiting examples, and thus it can be appreciated that the specific structural and functional details disclosed herein may be representative and do not necessarily limit the scope of the embodiments, the scope of which is defined by the appended claims.
The one or more sensors 2 monitor one or more aspects of a user's behavior or activities (“user actions”). Monitoring user actions may also include sensing the behavior of people other than the user (regardless of whether they are users of the system), the behavior of other entities (e.g., organisms, organizations, the environment) with which a given user interacts (e.g., sensing how they respond to actions by the user), and other relevant contextual attributes. Those sensors 2, as well as other elements of the training system, may be operated by one or more entities and may be deployed across a wide range of geographies, including different jurisdictional boundaries.
Behavior or activity data 3 may be recorded over time in one or more data storage devices 1012.
Behavior or activity data 3 may further be used in combination with historical user training data 4, which may be stored in one or more data storage devices 1012 and may include data related to the training one or more users have taken in the past. Historical user training data 4 may include information about when, and how well, one or more users performed in prior training or assessments. Static user profiles 5, which may include, for example, the role of one or more individual users in the organization, their education levels, or demographic information, may also be stored in one or more data storage devices 1012 and may be used in combination with the historical user training data.
Training needs models 6 may be stored in one or more data storage devices 1012 and may correlate one or more behaviors or activities with training that is relevant to those behaviors or activities. Training needs models 6 may be qualitative or quantitative in nature, and may include a mixture of both qualitative and quantitative aspects. Training needs models 6 may vary in complexity, ranging from simple “if-then” rules, for example, that map patterns of sensed data with training content typically required by people whose activity or behavior matches a given pattern, to more complex quantitative models that, for example, take into account considerations such as the probability that a user requires some type of training, the time it takes to take the training, the relative effectiveness of available training modules in addressing a training need, the type of a training a given user has taken in the past, the amount of time available to train the user and more.
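By way of a non-limiting illustration only (the rule thresholds, field names, and scoring formula below are assumptions introduced for this sketch and are not part of any described embodiment), a simple “if-then” training needs model and a more quantitative variant might be expressed as follows:

```python
# Hypothetical sketch of a training needs model; names and thresholds are illustrative only.

def if_then_training_needs(sensed):
    """Simple rule-based model: map patterns of sensed behavior to training topics."""
    needs = []
    if sensed.get("usb_devices_connected", 0) > 0:
        needs.append("safe_usb_practices")
    if sensed.get("untrusted_wifi_connections", 0) >= 3:
        needs.append("wifi_security")
    if sensed.get("risky_urls_visited", 0) > 5:
        needs.append("safe_browsing")
    return needs

def quantitative_training_score(prob_need, effectiveness, minutes_required, minutes_available):
    """Quantitative variant: expected benefit of a module, discounted to zero when the
    module does not fit within the time the user has available."""
    if minutes_required > minutes_available:
        return 0.0
    return prob_need * effectiveness  # expected benefit of exposing the user to the module
```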
A policy manager 7, which may be stored in one or more data storage devices 1012, may include instructions that may be executed by a processor. In one embodiment, the policy manager 7 is in charge of analyzing user behavior data 3, possibly in combination with information such as: (a) historical user training data 4 for the user, other similar users, or both, and (b) static profile data 5 such as the role of the user and the education level of the user. The policy manager 7 conducts this analysis in light of one or more relevant training needs models 6. The policy manager 7 selects at least one training intervention 11 from an extensible collection of training interventions 11 (“context-aware training content”) to be pushed or provided to the user 12.
Training content data 8 may be organized in the form of an extensible collection of training modules 10 and training meta data 9. The extensible collection of training modules 10 may range from very short training interventions intended to be delivered in a just-in-time fashion, to longer, more extensive training modules that users may be encouraged or required to be taken within a predetermined period of time. Training interventions 10 along with relevant training meta-data 9 may be stored in one or more data storage devices 1012. Relevant training meta-data 9 for a training intervention may include information about the training needs the training intervention is designed to address, the format in which the training intervention can be delivered, the amount of time the training intervention typically requires, estimated effectiveness of the training intervention (possibly across all users or possibly for a subset of users based on considerations such as level of education, age, gender, prior training to which the users have been exposed) and other relevant considerations. The training meta-data 9 may include annotations and those annotations may be used by a policy manager 7 to select training content that is most appropriate for one or more users and when to provide that training content to the user or user group. Some training interventions may also be customizable based on relevant contextual information, such as the activities the user is engaged in, time available to train the user, available devices to deliver the content, preferred user language, demographic information and other contextual information.
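As a purely illustrative sketch of how such training meta-data 9 could be organized (the field names and example values are assumptions, not requirements of the system), a structured record per training intervention might capture the needs addressed, delivery formats, required time, and estimated effectiveness:

```python
from dataclasses import dataclass, field

@dataclass
class TrainingIntervention:
    # Illustrative meta-data fields for a training intervention; names are assumptions.
    module_id: str
    needs_addressed: list           # e.g. ["smartphone_security", "phishing"]
    delivery_formats: list          # e.g. ["video", "interactive_quiz", "sms"]
    minutes_required: int
    estimated_effectiveness: float  # 0.0 - 1.0, possibly per user subgroup
    prerequisites: list = field(default_factory=list)

catalog = [
    TrainingIntervention("smartphone_security_101", ["smartphone_security", "phishing"],
                         ["video", "interactive_quiz"], 12, 0.7),
    TrainingIntervention("usb_baiting_awareness", ["safe_usb_practices"],
                         ["just_in_time_message"], 2, 0.55),
]
```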
The extensible collection of training interventions can change over time. For example, the extensible collection of training interventions may have training interventions deleted, added or modified. The training interventions can also be provided by different sources including, for example, corporate training developed in-house, external training interventions provided by vendors, training interventions obtained via personal subscriptions, and training interventions offered by service providers such as a doctor, a dietician, or a health club. In addition to the possibility that training interventions may vary over time, available sensors and other sources of contextual information may also vary over time. For example, a user may acquire a new mobile phone with additional sensors, new data about the user may be collected by a new source, and a new source of data may become able to interface with the context-aware training system.
Sensed data about user behavior and activities can include activities conducted in cyber space, activities in the physical world or a combination thereof. Sensed data may include any activity or behavior that can be tracked, observed, or recorded in some manner, for example, driving behavior, table manners, physical, mental and social health-related activities and habits, professional activities, social activities, etc. Sensed data may also include data relating to the behavior of people (not necessarily users of the system) with whom the user interacts in some manner. For example, sensed data may include responses received by the user from people, organisms, objects, surrounding elements or other entities with whom the user interacts, whether directly or indirectly.
Sensed data may also be provided by a system administrator via an administrator client 1014. Sensed data could include information such as the scheduled deployment of corporate smart phones. Such sensed data, when processed by the policy manager 7 based on training needs models, can help anticipate the need to train employees in the area of smart phone security and can result in the assignment of smart phone security training interventions to those employees.
One or more sensors 2 can include one or more devices, artifacts or other sources of information. For example, sensors 2 can include hardware, software, electromechanical devices, bio-sensory devices, and sources of information provided by third parties. Sensors 2 can be used to sense one or more aspects of a user's activities or behavior, whether in the context of routine activities or in response to artificially created situations (e.g. a mock situation or exercise created to evaluate a user's response). The sensors 2 can be embedded in or interfacing with smart phones, laptop computers, desktops, tablets, e-readers, body parts, or any other devices, appliances or elements of the user's local or global environment (e.g. smart home, smart car, smart office, or other mobile or pervasive computing device or appliance, including medical devices, water quality sensors, surveillance cameras, and other environmental sensors). The sensor 2 can include a data storage device or processor, for example in microprocessor form, and can obtain data provided by the user, by people other than the user, by organizations, or by entities including colleagues, friends, family members, strangers, doctors. The sensor 2 can alternately or in addition obtain data provided by systems (including data aggregated and synthesized from multiple sources, including aerial sensors, space-based sensors, implanted devices, and medical devices). For example, the sensor 2 can sense calendar information, status updates on social networks, and credit card transactions and can sense information or actions obtained through video surveillance. The sensor 2 can also sense a combination of data.
User behavior data 3 can be captured and recorded in one or more locations and may include relevant statistics, such as frequency associated with different types of events or situations, trends, and comparisons against relevant baselines. Such user behavior data 3 may help create a unique profile for each individual user that captures this user's activities and behaviors at a particular point in time or over different periods of time.
Historical user training data 4 may inform the selection of relevant training for a user by capturing the training history of that user. Historical user training data 4 may, include the training modules to which that user has already been exposed, how often and when that user was exposed to training modules, how well the user responded when taking the training modules, and other indicators of the user's proficiency in the area or areas in which the user has been trained. User proficiency can include, for example, recorded instances where the user failed to conform to expected best practices or apply relevant knowledge covered by the training system.
An example of a domain that can benefit from sensing user behavior is cyber security training and awareness for everyday users. The complexity of today's computers, including cell phones, tablets and other computer-powered or Internet-enabled devices, and networking systems make them vulnerable to an ever-wider range of attacks. Human users who adopt best practices and strategies (e.g. not falling for Internet-enabled social engineering attacks, regularly checking and installing software patches, adopting safe browsing practices, safe USB memory practices, safe password management practices, etc.) can often help reduce their exposure to many of those threats. Training everyday users to adopt improved strategies that address potential threats can be a daunting task. Accordingly, an effective way to mitigate risks is to prioritize training for individual users based on the threats to which they are most likely to be exposed by taking into account information about user activities or behaviors and/or other relevant contextual attributes such as their prior training history and level of expertise.
Examples of behavior or activity sensors 14 in the cyber security training domain include sensors that detect attachments in emails sent or received by a user, sensors to determine whether one or more users access different services over secure connections, sensors to identify the number, type and/or identity of applications installed on a user's mobile phone, and sensors to track the locations, including Internet web pages, a user visits. Sensors 14 can also include, for instance, sensors to detect USB key usage, record browsing history, identify Bluetooth headset use, sensors that detect the number or types of emails received, sensors that inspect the content of emails, and sensors that track the physical location of users.
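For illustration only, sensed events from such cybersecurity sensors might be represented as simple records of the following form (the event names, fields, and values are assumptions made for this sketch, not a description of any particular sensor implementation):

```python
import time

def make_event(user_id, sensor, action, **details):
    # Illustrative behavior/activity event record; field names are assumptions.
    return {"user_id": user_id, "sensor": sensor, "action": action,
            "timestamp": time.time(), "details": details}

# Examples of events the cybersecurity sensors described above might produce:
events = [
    make_event("u123", "usb_sensor", "usb_device_connected", device_class="mass_storage"),
    make_event("u123", "wifi_sensor", "wifi_connected", ssid="CoffeeShopGuest", encrypted=False),
    make_event("u123", "browser_sensor", "url_visited", url="http://example.test/login"),
]
```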
In this domain, one embodiment of the invention includes a policy manager 19, whose functionality may be performed by a processor, such as a processor that is part of an analysis host computer 1010.
The policy manager 19 may operate autonomously or according to a mixed-initiative mode. In a mixed-initiative mode, a system administrator (e.g., a security analyst, a member of human resources in charge of training, or some other role in an organization) uses an administrator client 1014 to interact with the policy manager 19.
In general different training interventions may utilize different delivery devices, some just with output capability, others with different combinations of output and input functionality.
The system may include a storage system 1012, which may comprise a plurality of storage devices, including cloud-based devices, possibly located across a plurality of locations. The storage system 1012 may serve as repository for static user data 5, recorded data collected from one or more sensors 2, historical user training data 4, and training needs models 6. The storage system 1012 may also store part or all of the training content 10 and training meta-data 9 available to the context-aware training system.
The computers 1002, 1003, 1007, 1010 and other devices 1005, 1006 and artifacts 1008, 1013 may be computers or computer systems as described above and may each include at least one processor and possibly one or more other components of a computer or network of computers. For example, the analysis host computer 1010 may be a single server or could be a distributed computing platform or a cloud-based system running software such as Microsoft WINDOWS, Linux or UNIX. The participant computers, which may include one or more laptops 1003, tablets 1002, smart phones 1007, administrator devices 1014 or output devices 1013, may themselves comprise a collection of participant computers capable of network connectivity. Those devices 1002, 1003, 1007, 1013, and 1014 may support any number of input and output functions. Those input and output functions may be embedded in the devices themselves or may be provided by satellite hardware such as a keyboard, mouse, display, or speaker. Devices may be connected to the network either through a physical hardwire connection or through wireless technology such as 802.11 Wi-Fi, BLUETOOTH, near field communication (NFC), or GSM/CDMA/LTE cellular networks, or through other communication methods or systems. The operating system of each participant computer could include Microsoft WINDOWS, Linux, UNIX, Mac OS X, ANDROID, iOS, PALM, or another operating system. When relevant, the computers 1002, 1003, 1007, 1013, and 1014 may run browser software such as, for example, MOZILLA, INTERNET EXPLORER (IE), SAFARI, CHROME or another browser software or browsing methodology. The type and configuration of the participant computers (e.g. 1002, 1003, 1007, 1010) can be otherwise configured as desired.
The communication networks 1009 could be any type of data or computer communication network or any other technology enabling computers and possibly other devices or appliances to communicate with one another.
In one embodiment, the methods discussed herein may be performed by one or more of the components of the system described above. One embodiment of a method of context-aware training includes a user action process, a policy management process 140, and a response process 185.
The user action process includes detecting an interaction event at 110. When detecting an interaction event at 110 in this embodiment, a sensor 2 detects the interaction event, which corresponds to user activities or behaviors or, more generally, other contextual attributes relevant to the training available. Such contextual attributes may include any relevant sensory data as well as information obtained from other relevant sources of information, such as browser history, credit card records, surveillance cameras, electronic doors, employment records, information collected about a person with whom the user has interacted, and social networking information. In one instance, a software agent or executable program will run on a participant computer or device (e.g. 1002, 1003, 1005, 1006, 1007, 1008) and locally process sensed data to detect one or more relevant interaction events prior to forwarding the detected information (e.g. in the form of interaction signatures) to a storage system 1012. In some embodiments, user data 3 can be forwarded directly to the analysis host computer 1010. The storage system may be responsible, among other things, for storing sensed user data 3. Detecting an interaction event 110 may include filtering sensed data, aggregation of sensed data, pre-processing of the sensed data, analysis of the sensed data, and/or generation of one or more event signatures 120.
The “interaction” does not have to be known to the user. Rather the term interaction here is intended to also include the detection of a particular behavior or activity by one or more sensors. This could include a browser-based sensor to detect that a user visits potentially harmful websites or a camera coupled with a computer vision program to detect a particular activity or behavior (e.g. driver drowsiness or bad table manners).
The user action process may include generating an interaction signature at 120, though in some embodiments raw sensor data may be stored, as shown at 130, or directly forwarded to the analysis host computer 1010. The interaction signature can be produced in various ways, including using cryptographic hash functions. In some embodiments, sources of sensory data may forward sensed information to one or more other participant computers.
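As one hedged example of the signature step, an interaction signature could be derived by applying a cryptographic hash function, such as SHA-256, to a canonical serialization of the sensed event; the event fields below are illustrative assumptions rather than a prescribed format:

```python
import hashlib
import json

def interaction_signature(event: dict) -> str:
    """Produce a compact interaction signature by hashing a canonical serialization
    of a sensed event with a cryptographic hash function (SHA-256 here)."""
    canonical = json.dumps(event, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

sig = interaction_signature({"user_id": "u123", "sensor": "usb_sensor",
                             "action": "usb_device_connected"})
```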
The interaction signature, sensed information and, when appropriate, the identity of the user to which the interaction signature corresponds, may be forwarded to a storage system 1012 responsible, among other things, for storing sensed user data 3 at 130. In other embodiments of the method of context-aware training, sensed information may be directly communicated to an analysis host computer 1010 responsible for hosting the policy manager 7 functionality enabling the policy manager 7 to immediately analyze the sensed information based on relevant training needs models 6.
The policy management process 140 includes initiating training analysis at 150 and, when appropriate, identifying one or more relevant training interventions from a collection of available training interventions, including possibly just-in-time training interventions. The policy manager 7 is responsible for determining, and possibly prioritizing, the training content to be pushed to individual users. The policy manager 7, in this embodiment, initiates a training analysis process 150 for one or more users and collects relevant user data 160 that may be beneficial in conducting the training analysis 150. Gathering user data 160 may include accessing static user data and sensed user data. Sensed user data may include relevant contextual data, whether obtained directly from a sensing device 2 or participant computer, or obtained from parts of a storage system storing sensed user data. Gathering user data 160 may also include retrieving relevant historical training data 4, retrieving relevant training needs models 6 (to the extent that they are not stored locally on the analysis host computer 1010), and/or retrieving training meta-data 9 about available training interventions. The policy manager 7 applies training needs models 6 to determine which training interventions to push to the user and, when relevant, how to prioritize these training interventions.
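A minimal, non-limiting sketch of that selection step, assuming the illustrative intervention records introduced above (identified training needs, a catalog of interventions with meta-data, and a per-user time budget, all hypothetical names), might rank candidate interventions by estimated effectiveness per minute of required training time:

```python
def select_interventions(needs, catalog, minutes_available):
    """Rank catalog entries that address an identified need by estimated effectiveness
    per minute of required training time, then pick greedily within the time budget."""
    candidates = [m for m in catalog if set(m.needs_addressed) & set(needs)]
    candidates.sort(key=lambda m: m.estimated_effectiveness / max(m.minutes_required, 1),
                    reverse=True)
    selected, used = [], 0
    for m in candidates:
        if used + m.minutes_required <= minutes_available:
            selected.append(m)
            used += m.minutes_required
    return selected
```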
Embodiments of the policy manager 7 may operate according to one or more modes. Those policy manager modes include scheduled modes, routine modes, real-time modes, mixed-initiative modes and combinations thereof. In an embodiment of context-aware training in which a scheduled mode is utilized, the policy manager 7 regularly assesses the overall training needs of a plurality of individual users and reprioritizes training content to be pushed or delivered to each individual user. In some embodiments, that process may be fully automated. In other embodiments, that process may follow a mixed-initiative mode, where an administrative user (e.g. a system administrator, a member of personnel in charge of training, an analyst or some other suitable person, including possibly the user himself) reviews, via an administrator client 1014, analysis results produced by the policy manager 7.
Regular assessment of user training needs may involve running in batch mode, where all users are being reviewed in one batch or where different groups of users are processed in different batches, possibly according to different schedules. Regular assessment of user training needs may also include pushing short security quizzes and creating mock situations aimed at better evaluating the needs of an individual user or a group of users. In a real-time mode, the policy manager 7 operates in an event-driven manner enabling it to more rapidly detect changes in user behavior or activities and other relevant contextual attributes, and to more quickly push training interventions that reflect the risks to which the user is exposed at a desired time. Any of those modes can be implemented in the form of simple rules or more complex logic that can potentially be customized and refined by an organization where, for instance, the organization is using administrator client software interfaces 1014. The rules or more complex logic can also be defined to allow for mixed initiative iterations with system administrators and users, where results from the analysis performed by the policy manager 7 are shown to the user and the user can interact with the policy manager 7 to refine the analysis, evaluate different options, and possibly finalize the selection, prioritization and scheduling of training interventions, whether for individual users or groups of users. The rules and/or logic may be manually configured by system administrators, programmers or other qualified personnel (whether working for the organization providing the context-aware training system, for a customer organization, for a contractor working for either of those organizations, or by some other individual or group of individuals) or derived through statistical analysis or data mining techniques, or a combination of both. The administrator client software interface may also allow administrators to maintain and customize training needs models and other relevant parameters, data elements and elements of functionality of the context-aware training system. Maintenance and customization may include updating and customizing the collection of available training interventions, and updating and customizing individual training interventions, including associated meta-data (e.g. pre-requisites, compatible delivery platforms, required time, effectiveness and other meta-data). Maintenance and customization may also include accessing, reviewing and manipulating other relevant system data, including static user data, sensed user data, historical training data, and other meta-data.
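As a rough sketch of the scheduled (batch) and real-time (event-driven) modes described above, with all function and queue names assumed for illustration only, the same analysis routine could be driven either by a periodic loop over users or by individual sensed events:

```python
import queue
import time

def run_batch_mode(users, analyze_user, interval_seconds=24 * 3600, cycles=1):
    """Scheduled mode: periodically reassess training needs for every user in the batch."""
    for _ in range(cycles):
        for user_id in users:
            analyze_user(user_id)
        time.sleep(interval_seconds)

def run_event_driven_mode(event_queue: queue.Queue, analyze_event, timeout=1.0):
    """Real-time mode: react to each sensed event as it arrives, returning once the
    queue has been idle for the given timeout."""
    while True:
        try:
            event = event_queue.get(timeout=timeout)
        except queue.Empty:
            return
        analyze_event(event)
```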
Once relevant training interventions have been identified by the policy manager 7 for one or more users, those interventions may be delivered or pushed to the user at 180. Delivery of training interventions, which may include training content, may be performed in a number of ways, including sending relevant training interventions directly to one or more output devices capable of delivering the identified interventions to the user. Delivering training interventions may also be performed by updating a schedule indicating when training interventions should be delivered or otherwise exposed to the user, or updating a schedule that will be exposed to the user, possibly with a combination of required and recommended training content for engagement by the user. Training interventions may include one or more dates by which the user should experience the training intervention, proficiency levels that may have to be achieved by the user while engaging with the training content (e.g. training quiz, training game, simulation exercise, responses to mock situations and other interactive types of interventions). Training interventions may also be performed through a combination of types of interventions including, for example, a delivery of a combination of just-in-time training interventions to the user, training assignments to be completed by the user by assigned dates or times, and recommendations for further training of the user. Training intervention, including training content, assignments, and recommendations, may also be provided to the user by other relevant means.
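A minimal sketch of the delivery bookkeeping described above (the field names and default deadline are assumptions) might record, for each assigned intervention, the delivery channel and the date by which the user should complete it:

```python
from datetime import date, timedelta

def assign_training(schedule, user_id, module_id, channel, days_to_complete=14, required=True):
    """Add an intervention to the user's training schedule with a completion deadline."""
    schedule.setdefault(user_id, []).append({
        "module_id": module_id,
        "channel": channel,                       # e.g. "email", "sms", "just_in_time"
        "due_date": date.today() + timedelta(days=days_to_complete),
        "required": required,                     # required vs. recommended training
    })
```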
Training interventions may include the creation of mock situations, whether through fully automated processes (e.g. automated delivery of SMS phishing messages to a number of users), manual processes (e.g. activating personnel responsible for creating mock situations such as mock impersonation phone calls intended to train people not to fall for social engineering attacks), or hybrid processes (e.g. a mock USB memory attack, where a USB memory device includes fake malware intended to train one or more users not to plug USB memory sticks into a computer, and where such USB memory devices are manually scattered around an office to lure employees to pick them up). Training interventions may come in many different formats, ranging from video and audio content, to cartoons, alerts (e.g. alarms, flashing lights), training interventions involving personnel (e.g. a phone call from the boss of a user, a training session with a certified instructor, a conversation with the parent of a user, a session with a dietician), or any combination of the above or any other relevant format by which training content may be delivered to a user.
In the response process 185, as users engage with the training interventions at 190, their responses may be recorded in part or in whole at 200. That response data may itself be analyzed in real time by the policy manager 7 or may be stored in an appropriate format (whether in raw form or in summarized form), possibly for later analysis, in a part of the storage system 1012 responsible for storing historical training data, in a part of the storage system responsible for storing user behavior data 3, in some other relevant storage, or in any combination of the above. Response data may include whether the user experiences the training, when the user experiences the training, how long the user takes to experience the training, whether the user's behavior changes after taking the training, the level of proficiency exhibited by the user while taking the training (e.g. in the case of an interactive training module), changes in the behaviors or responses of people the user interacts with after taking the training, or any other relevant data.
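For instance, the response data described above might be accumulated per user in a simple history store, as in the following illustrative sketch (all field names are assumptions), so that the policy manager 7 can later factor prior performance into new selections:

```python
def record_training_response(store, user_id, module_id, completed, minutes_spent, score):
    """Append a response record to historical training data for later analysis
    by the policy manager; field names are illustrative only."""
    store.setdefault(user_id, []).append({
        "module_id": module_id,
        "completed": completed,
        "minutes_spent": minutes_spent,
        "proficiency_score": score,   # e.g. quiz, game, or simulation performance
    })
```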
In the case of an embodiment of a context-aware cybersecurity training system, sensed user data 3 is analyzed to identify threat scenarios for which a user in a given context is most susceptible or most at risk.
An embodiment of a partial training needs model 6 based on simple threshold levels is illustrated in the accompanying figures.
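Although the figure itself is not reproduced here, a threshold-based partial training needs model of this kind could plausibly take the following form (the threat scenarios, counters, and threshold values are illustrative assumptions, not values taken from the described embodiment):

```python
# Illustrative threshold-based training needs model; thresholds are assumptions.
THRESHOLDS = {
    "phishing":       ("suspicious_emails_opened", 2),
    "unsafe_usb_use": ("unknown_usb_devices_connected", 1),
    "rogue_wifi":     ("open_wifi_connections", 3),
}

def at_risk_scenarios(sensed_counts):
    """Return the threat scenarios whose sensed counters meet or exceed their thresholds."""
    return [scenario for scenario, (counter, threshold) in THRESHOLDS.items()
            if sensed_counts.get(counter, 0) >= threshold]
```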
A user may be identified as being at high risk for a number of different possible threat scenarios. In one embodiment, the policy manager 7 is responsible for consolidating the training needs identified for the user and for identifying a suitable and possibly prioritized collection of training actions, based on considerations such as the collection of training interventions available for addressing the collection of training needs identified by the model.
Some training interventions can address more than one training need. For instance a smart phone security training module may address both smart phone security at large as well as phishing emails in the context of smart phones. Training actions selected by the policy manager may include immediate, just-in-time training interventions, assignments of training interventions the user should take by a certain date, and recommendations for additional training.
Elements of an embodiment of a slightly more complex training needs model 4000 based on risk models are illustrated in the accompanying figures.
The particular format of the model shown in the figures is illustrative only; other formats may be used.
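As a hedged sketch of what a risk-model-based variant could compute (the probabilities, cost figures, and function names are purely illustrative assumptions), the estimated risk for a threat scenario might combine the probability of an incident with its expected cost, and interventions might be prioritized by the risk reduction they provide per minute of training:

```python
def scenario_risk(prob_incident, expected_cost):
    """Estimated risk of a threat scenario = probability of an incident x its expected cost."""
    return prob_incident * expected_cost

def training_priority(risk_before, risk_after_training, minutes_required):
    """Prioritize interventions by the risk reduction they buy per minute of training time."""
    return (risk_before - risk_after_training) / max(minutes_required, 1)

# Example: a user judged 20% likely to fall for a phishing email costing ~5,000 cost units.
phishing_risk = scenario_risk(0.20, 5000)             # 1000.0 expected cost units
priority = training_priority(phishing_risk, 400, 15)  # risk reduction per training minute
```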
In other embodiments, a computer-implemented training system is contemplated in which one or more of the sensors 2 are components of a user computing device (e.g., 1002, 1003, 1005, 1006, 1007, or 1008), while the analysis is performed by a remote analysis host computer 1010.
The user in embodiments of context-aware training could be a human user or, for example, a robot, a cyber entity, an organism, an organization, a trainable entity, or a group or subset of those users. Examples of cyber entities include intelligent agents, such as Siri on the iPhone, an avatar in a virtual environment, or a character in a computer game.
Examples of the training interventions and meta-data described above are illustrated in the accompanying figures.
While specific embodiments of the invention have been described in detail, it should be appreciated by those skilled in the art that various modifications, alterations, and applications could be developed in light of the overall teachings of the disclosure. Accordingly, the particular arrangements, systems, apparatuses, and methods disclosed are meant to be illustrative only and not limiting as to the scope of the invention.
Inventors: Norman Sadeh-Koniecpol, Kurt Wescoe, Jason Hong, Jason Brubaker