A method, computer program product, and system are disclosed for facilitating access by a first person to a secure region within an environment having a plurality of items, wherein the secure region is at least partly defined by an access control device. The method acquires, using at least one visual sensor disposed within the environment, first image information including the first person and the access control device. The method identifies the first person using image analysis performed on the first image information, which includes associating the first person with a security level. Further, the method identifies, using image analysis, a first behavior of the first person relative to the access control device. Upon determining that the first behavior corresponds to a predefined visual access behavior, and that the security level of the first person satisfies a predetermined threshold security level associated with the access control device, the method deactivates a security element to permit the first person to physically access the secure region.
1. A computer-implemented method of facilitating access by a first person to a secure region within an environment having a plurality of items, wherein the secure region is at least partly defined by an access control device, the method comprising:
acquiring, using at least one visual sensor disposed within the environment, first image information including the first person and the access control device;
identifying, using one or more computer processors coupled with the at least one visual sensor, the first person using image analysis performed on the first image information, wherein identifying the first person comprises associating the first person with a security level selected from a plurality of predetermined security levels;
identifying, using the image analysis, a first behavior of the first person relative to the access control device; and
upon determining that the first behavior corresponds to a predefined visual access behavior, and that the security level of the first person satisfies a predetermined threshold security level associated with the access control device, transmitting a first control signal to a security element of the access control device, thereby deactivating the security element and permitting the first person to physically access the secure region through the access control device,
wherein the predefined visual access behavior is selected from a group comprising one or more of:
(1) reaching toward a handle, knob, or lever of the access control device,
(2) performing a predefined gesture mimicking operating a handle, knob, or lever of a door while in proximity of the access control device,
(3) performing a predefined gesture mimicking opening an enclosure with dual doors while in proximity of the access control device,
(4) reaching toward the access control device,
(5) waving towards the access control device,
(6) performing a predefined hand gesture indicating approval while in proximity of the access control device, and
(7) mouthing a predefined word while in proximity of the access control device.
7. A system for facilitating access by a first person to a secure region within an environment having a plurality of items, wherein the secure region is at least partly defined by an access control device, the system comprising:
a plurality of visual sensors located throughout the environment;
a controller configured to:
acquire, using at least one visual sensor disposed within the environment, first image information including the first person and the access control device;
identify, using one or more computer processors coupled with the at least one visual sensor, the first person using image analysis performed on the first image information, wherein identifying the first person comprises associating the first person with a security level selected from a plurality of predetermined security levels;
identify, using the image analysis, a first behavior of the first person relative to the access control device; and
upon determining that the first behavior corresponds to a predefined visual access behavior, and that the security level of the first person satisfies a predetermined threshold security level associated with the access control device, transmit a first control signal to a security element of the access control device, thereby deactivating the security element and permitting the first person to physically access the secure region through the access control device,
wherein the predefined visual access behavior is selected from a group comprising one or more of:
(1) reaching toward a handle, knob, or lever of the access control device,
(2) performing a predefined gesture mimicking operating a handle, knob, or lever of a door while in proximity of the access control device,
(3) performing a predefined gesture mimicking opening an enclosure with dual doors while in proximity of the access control device,
(4) reaching toward the access control device,
(5) waving towards the access control device,
(6) performing a predefined hand gesture indicating approval while in proximity of the access control device, and
(7) mouthing a predefined word while in proximity of the access control device.
13. A computer program product to facilitate access by a first person to a secure region within an environment having a plurality of items, wherein the secure region is at least partly defined by an access control device, the computer program product comprising:
a non-transitory computer-readable storage medium having computer-readable program code embodied therewith, the computer-readable program code executable by one or more computer processors to perform an operation that includes:
acquiring, using at least one visual sensor disposed within the environment, first image information including the first person and the access control device;
identifying, using one or more computer processors coupled with the at least one visual sensor, the first person using image analysis performed on the first image information, wherein identifying the first person comprises associating the first person with a security level selected from a plurality of predetermined security levels;
identifying, using the image analysis, a first behavior of the first person relative to the access control device; and
upon determining that the first behavior corresponds to a predefined visual access behavior, and that the security level of the first person satisfies a predetermined threshold security level associated with the access control device, transmitting a first control signal to a security element of the access control device, thereby deactivating the security element and permitting the first person to physically access the secure region through the access control device,
wherein the predefined visual access behavior is selected from a group comprising one or more of:
(1) reaching toward a handle, knob, or lever of the access control device,
(2) performing a predefined gesture mimicking operating a handle, knob, or lever of a door while in proximity of the access control device,
(3) performing a predefined gesture mimicking opening an enclosure with dual doors while in proximity of the access control device,
(4) reaching toward the access control device,
(5) waving towards the access control device,
(6) performing a predefined hand gesture indicating approval while in proximity of the access control device, and
(7) mouthing a predefined word while in proximity of the access control device.
2. The computer-implemented method of claim 1, further comprising:
acquiring, using at least one visual sensor disposed within the environment, second image information including the first person and the access control device;
identifying, using image analysis performed on the second image information, a second behavior of the first person relative to the access control device; and
upon determining, based on the second behavior, that the first person is no longer physically accessing the secure region, transmitting a second control signal to the security element of the access control device, thereby activating the security element to prevent subsequent unauthorized attempts to physically access the secure region.
3. The computer-implemented method of claim 1, further comprising:
accessing known persons data stored in a computer-readable memory coupled with the one or more computer processors, wherein the known persons data identifies a plurality of known persons having a predefined association with the environment; and
determining, based on the known persons data, that the plurality of known persons includes the first person,
wherein associating the first person with a security level comprises determining the security level based on the known persons data.
4. The computer-implemented method of claim 1, further comprising:
accessing known persons data stored in a computer-readable memory coupled with the one or more computer processors, wherein the known persons data identifies a plurality of known persons having a predefined association with the environment; and
determining, based on the known persons data, that the plurality of known persons does not include the first person,
wherein associating the first person with a security level comprises determining the security level based on one or more characteristics of the first person visually determined using the image analysis.
5. The computer-implemented method of claim 1, wherein identifying the first person further comprises determining a class of the first person from a plurality of predetermined classes, and
wherein the security level is based at least in part on the determined class of the first person.
6. The computer-implemented method of claim 1, further comprising:
upon determining that the predefined visual access behavior corresponds to a predefined emergency behavior indicating the first person is in distress, communicating an emergency message.
8. The system of claim 7, wherein the controller is further configured to:
acquire, using at least one visual sensor disposed within the environment, second image information including the first person and the access control device;
identify, using image analysis performed on the second image information, a second behavior of the first person relative to the access control device; and
upon determining, based on the second behavior, that the first person is no longer physically accessing the secure region, transmit a second control signal to the security element of the access control device, thereby activating the security element to prevent subsequent unauthorized attempts to physically access the secure region.
9. The system of claim 7, wherein identifying the first person comprises:
accessing known persons data stored in a computer-readable memory coupled with the one or more computer processors, wherein the known persons data identifies a plurality of known persons having a predefined association with the environment; and
determining, based on the known persons data, that the plurality of known persons includes the first person,
wherein associating the first person with a security level comprises determining the security level based on the known persons data.
10. The system of claim 7, wherein identifying the first person comprises:
accessing known persons data stored in a computer-readable memory coupled with the one or more computer processors, wherein the known persons data identifies a plurality of known persons having a predefined association with the environment; and
determining, based on the known persons data, that the plurality of known persons does not include the first person,
wherein associating the first person with a security level comprises determining the security level based on one or more characteristics of the first person visually determined using the image analysis.
11. The system of claim 7, wherein identifying the first person further comprises determining a class of the first person from a plurality of predetermined classes, and
wherein the security level is based at least in part on the determined class of the first person.
12. The system of claim 7, wherein the controller is further configured to:
upon determining that the predefined visual access behavior corresponds to a predefined emergency behavior indicating the first person is in distress, communicate an emergency message.
14. The computer program product of claim 13, wherein the operation further includes:
acquiring, using at least one visual sensor disposed within the environment, second image information including the first person and the access control device;
identifying, using image analysis performed on the second image information, a second behavior of the first person relative to the access control device; and
upon determining, based on the second behavior, that the first person is no longer physically accessing the secure region, transmitting a second control signal to the security element of the access control device, thereby activating the security element to prevent subsequent unauthorized attempts to physically access the secure region.
15. The computer program product of claim 13, wherein the operation further includes:
accessing known persons data stored in a computer-readable memory coupled with the one or more computer processors, wherein the known persons data identifies a plurality of known persons having a predefined association with the environment; and
determining, based on the known persons data, that the plurality of known persons includes the first person,
wherein associating the first person with a security level comprises determining the security level based on the known persons data.
16. The computer program product of claim 13, wherein the operation further includes:
accessing known persons data stored in a computer-readable memory coupled with the one or more computer processors, wherein the known persons data identifies a plurality of known persons having a predefined association with the environment; and
determining, based on the known persons data, that the plurality of known persons does not include the first person,
wherein associating the first person with a security level comprises determining the security level based on one or more characteristics of the first person visually determined using the image analysis.
17. The computer program product of claim 13, wherein identifying the first person further comprises determining a class of the first person from a plurality of predetermined classes, and
wherein the security level is based at least in part on the determined class of the first person.
This application is related to U.S. patent application Ser. No. 15/134,185, filed Apr. 20, 2016 and entitled “ASSOCIATING MULTIPLE PERSONS WITH A VIRTUAL TRANSACTION IN AN ENVIRONMENT”. The related application is herein incorporated by reference in its entirety.
The present disclosure relates to an environment having a plurality of visual sensors disposed therein, and more specifically, to accessing a secure region of the environment using visually identified behaviors relative to an access control device at least partly defining the secure region.
So that the manner in which the above recited features of the present disclosure can be understood in detail, a more particular description of the disclosure, briefly summarized above, may be had by reference to embodiments, some of which are illustrated in the appended drawings. It is to be noted, however, that the appended drawings illustrate only exemplary embodiments and are therefore not to be considered limiting of its scope, for the disclosure may admit to other equally effective embodiments.
To facilitate understanding, identical reference numerals have been used, where possible, to designate identical elements that are common to the figures. It is contemplated that elements and features of one embodiment may be beneficially incorporated in other embodiments without further recitation.
Aspects of the current disclosure relate to an environment capable of selectively permitting physical access to a secure region defined therein. A number of different visual sensor devices are employed within the environment and networked to a controller to monitor a person's interactions with an access control device, through which the person may be permitted to physically access the secure region.
Using a system having one or more visual sensors within the environment, a retailer or other provider may compile and process acquired image data to selectively permit a person to physically access the secure region. The system may predict, or infer, the person's intent to access the secure region based on any number of visually observable actions, such as the person walking to a designated secure area within the environment and/or interacting with an access control device. Additionally or alternatively, the system may attempt to determine the person's intent directly, such as by sending a notification to an electronic device associated with the person, and so forth. The system may be further configured to manage the inventory of items within the environment.
Using the system to streamline access to secure areas tends to reduce the amount of time spent by persons (e.g., customers, employees, and/or vendors) involved with accessing items included within the secure area. The system may further determine with increased accuracy which secured items are accessed, when compared with conventional access means. The reduced time and/or improved accuracy tends to result in greater customer satisfaction and a more efficient use of employees. For example, by allowing access to secured items in real-time during a customer's transaction within the environment, a retailer may reduce, or completely eliminate, the need to use employees to physically allow customers access to the secured items, as the customer may simply interact with a designated access control device within the environment. Using the visual sensor based system, a retailer can compile valuable data on the types of items being accessed, the types of customers who are accessing the secured items, and the employees accessing the secured items. While generally discussed within the context of a shopping environment, such as a retail store or other commercial environment, it is contemplated that the techniques disclosed herein may be applied to other environments (some non-limiting examples include libraries, museums, classrooms, hospitals, etc.) to provide a similar experience for persons included therein.
The environment 100 includes a plurality of visual sensors 102 disposed throughout the environment 100. Each of the visual sensors 102 includes one or more types of visual sensors such as visible light sensors (e.g., image or video cameras), infrared (IR) sensors, and so forth. Generally, the visual sensors 102 may be disposed at any suitable location within the environment 100. Some non-limiting examples of locations include below, within, or above a floor of the environment 100, and within other structural components of the environment 100 such as walls or ceiling structures. In some embodiments, the visual sensors 102 may be disposed on, within, or near item display areas such as a display unit 106. As will be explained in greater detail below, the visual sensors 102 are used to identify and monitor a plurality of persons within the environment 100 and interactions between the plurality of persons, the purchasable items, the access control devices, and the secure region 112. Although not shown, one or more additional types of sensors may be disposed within the environment 100, such as audio sensors (e.g., microphones), pressure sensors, and so forth. In some cases, the additional sensors may be collocated with the visual sensors 102. Further, other devices may be used with the visual sensors 102 such as speakers, displays, transmitters, and so forth.
In an exemplary embodiment, at least some of the visual sensors 102 are oriented towards doorways 104 or alternate areas of ingress and/or egress in order to identify customers entering and/or exiting the environment 100, as will be explained in further detail with reference to FIG. 3.
Turning to FIG. 2, an exemplary system for administering the environment 100 is shown, in which a controller 201 comprises a processor 202, input/output (I/O) 204, and a memory 216.
The controller 201 communicates with the visual sensors 102 via the I/O 204. For example, the controller 201 may receive image data from the visual sensors 102 for processing. The controller 201 is further connected through the I/O 204 to a network 206 to communicate with a plurality of computing devices, some of which may be disposed outside the environment 100. Network 206 may include one or more networks of various types, including a local area or local access network (LAN), a general wide area network (WAN), and/or a public network (e.g., the Internet). In one exemplary embodiment, the controller 201 communicates via network 206 with customer devices 208 to relay information helpful for customers within the environment 100, such as discounted items, coupons, item location information, step-by-step directions for accessing a secured item, and so forth. The customer devices 208 may be mobile computing devices having any suitable form, such as handheld computing devices (e.g., tablets, smartphones) or worn computing devices (e.g., a head-mounted display, smartwatch). Further, the customer devices 208 may also be computing devices located within environment 100 that allow a customer to interact with the environment 100. For example, customer devices 208 may be a computer terminal (e.g., a kiosk or computer station) that a customer interacts with to facilitate access to a secured item or to request assistance from an employee of environment 100.
The controller 201 is further configured to communicate with employee devices 210, which may also include mobile computing devices and/or stationary computing devices (e.g., terminals, desktop computers). In some embodiments, the controller 201 communicates notifications to employee devices 210 indicating various conditions occurring within the environment such as an item running out of stock, a spill in an aisle, a customer needing assistance, a person accessing a secure region, and so forth. The controller 201 may also facilitate communication between multiple networked devices such as the customer devices 208 and/or the employee devices 210. For example, the controller 201 may receive information from a customer device 208 indicating that a customer needs help accessing a secured item. The controller 201 may then communicate to an employee device 210 the location of customer device 208, as well as the problem to be solved.
The controller 201 may also communicate with one or more corporate devices 212 to facilitate ordering of items, sending customer data for analysis, tracking secured items, and so forth. The corporate devices 212 are computing devices having any suitable form, such as handheld computing devices (e.g., tablets, smartphones), worn computing devices (e.g., an optical head-mounted display, smartwatch), or stationary computing devices (e.g., terminals, desktop computers). The controller 201 may communicate with third party devices 214, such as computing devices associated with vendors, to order items, communicate status of shipments, communicate problems with items, etc. The third party devices 214 are computing devices having any suitable form, such as handheld computing devices (e.g., tablets, smartphones), worn computing devices (e.g., an optical head-mounted display), or stationary computing devices (e.g., terminals, desktop computers).
Memory 216 may include a variety of computer-readable media selected for their size, relative performance, or other capabilities: volatile and/or non-volatile media, removable and/or non-removable media, etc. Memory 216 may include cache, random access memory (RAM), storage, etc. Storage included as part of memory 216 may provide a non-volatile memory for the networked computing devices (e.g., employee devices 210, corporate devices 212), and may include one or more different storage elements such as Flash memory, a hard disk drive, a solid state drive, an optical storage device, and/or a magnetic storage device. Memory 216 may be included in a single computing device or may represent an aggregation of memory included in a plurality of devices.
For simplicity, the following paragraphs reference specific functional modules and data in explaining the functionality of controller 201, but this is not intended to limit the embodiments of the present disclosure. The memory 216 stores a plurality of modules generally comprising program code, which when executed by the processor 202, perform a variety of functions for administering the environment 100. It is to be understood that each module and data are generally capable of communicating with all other modules and data within memory 216, such as through transmitted and received signals and/or shared portions of memory 216. The controller 201 encompasses the functionality of all modules and data described in the following paragraphs.
An image processing module 220 receives image information 236 from visual sensors 102, and performs image processing on the image information 236 according to known techniques. The image information 236 may be in any form suitable for identifying persons, items, and behaviors within the environment, such as full-motion video, still frames, etc. Based on the image processing provided by the image processing module 220, an association module 218 determines whether to associate or disassociate multiple persons within a transaction in the environment.
A behavior classification module 222 identifies one or more predefined behaviors 232 by identified persons within the environment 100. The predefined behaviors 232 comprise behaviors associated with interacting with items, which may include direct interactions of the person with one or more items and/or behaviors which generally increase or decrease a likelihood that the person will interact with the item(s). Further, the predefined behaviors 232 may comprise behaviors associated with accessing secured items, which may include direct interactions with an access control device within the environment 100, moving within environment 100 to subsequently interact with an access control device, and so forth.
Some non-limiting examples of predefined behaviors 232 include moving towards a display unit, adding an item to a display unit, removing an item from a display unit, holding an item, gesturing towards an item and/or a display unit, moving within a designated region, walking within environment 100, and so forth. The predefined behaviors 232 may also contain information on specific motions performed by a person within environment 100, such as reaching toward a handle, knob, or lever of the access control device, a gesture mimicking operating a handle, knob, or lever of a door, a gesture mimicking opening an enclosure with dual doors, reaching toward the access control device, waving towards the access control device, performing a predefined hand gesture indicating approval while in proximity of the access control device, and mouthing a predefined word while in proximity of the access control device. Further, while some of the aforementioned non-limiting behaviors were described with reference to a display unit, it is to be understood that any of the behaviors may be directed towards a display unit and/or an access control device 110.
Item identifying module 224 determines the identity of an item within the environment 100. Item identifying module 224 may identify the item based on visually observable properties of a container holding the item, based on properties of the item itself, a combination of the properties of the item inside and outside of its respective container, and so forth. Item identifying module 224 may use attributes of the item and/or container such as size, shape, weight, color, material (e.g., plastic, cloth, leather, etc.), text (e.g., brand name, item name, advertising language, etc.), graphics (e.g., logos, pictures, etc.), identifying codes (e.g., Universal Product Code, Quick Response Code, etc.), identifying tags (e.g., clothing tags, hang tags, etc.), identifying labels (e.g., International Standard Book Number, International Standard Serial Number, etc.), and electronic identifiers (e.g., Radio Frequency Identification, magnetic strips, smart cards, etc.).
In one exemplary embodiment, item identifying module 224 compares the item's attributes to attributes within item information data 234 to determine the item's identity. The controller 201 may keep the item information data 234 up-to-date by adding and/or removing items as necessary. In one exemplary embodiment, item information data 234 is updated via network 206 from corporate devices 212 and/or the internet. In another exemplary embodiment, the item identifying module 224 may determine, based upon the identified attributes, that the item is not a part of the item information data 234. In some cases, the item identifying module 224 adds a new entry to the item information data 234 so the item may be identified in the future. Further, the item information data 234 may contain data on the security required for an item, which the controller 201 may update based upon the policies of environment 100.
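A minimal sketch of this attribute-matching step appears below. It is illustrative only: the `ItemRecord` structure, attribute names, and count-of-matches scoring rule are assumptions standing in for whatever matching logic the item identifying module 224 actually applies.

```python
from dataclasses import dataclass, field

@dataclass
class ItemRecord:
    """Hypothetical entry in item information data 234."""
    item_id: str
    attributes: dict = field(default_factory=dict)  # e.g., color, brand, UPC
    security_required: int = 0  # 0 = unsecured (cf. security required 413)

def identify_item(observed: dict, catalog: list[ItemRecord]) -> ItemRecord | None:
    """Return the catalog entry whose attributes best match the observed ones."""
    best, best_score = None, 0
    for record in catalog:
        # Simple scoring: one point per observed attribute that matches exactly.
        score = sum(1 for k, v in observed.items() if record.attributes.get(k) == v)
        if score > best_score:
            best, best_score = record, score
    return best  # None means the item is absent from item information data 234

catalog = [
    ItemRecord("A", {"color": "red", "brand": "Acme", "upc": "012345"}),
    ItemRecord("B", {"color": "blue", "brand": "Acme", "upc": "067890"}, security_required=3),
]
match = identify_item({"color": "blue", "upc": "067890"}, catalog)
print(match.item_id if match else "unknown item")  # -> B
```

A production module would likely weight attributes unevenly (an identifying-code match is near-conclusive, a color match is not), but the lookup-and-score shape would be similar.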
The person identifying module 226 identifies persons within the environment 100 by identifying physical traits of persons such as gender, height, hair color, eye color, clothing, and so forth. The person identifying module 226 compares the aforementioned traits with known persons data 240 to determine whether the identified person is a known person. Further, the person identifying module 226 determines to which of a plurality of predetermined classes 238 a person within the environment 100 belongs. Some non-limiting examples of predetermined classes 238 include a customer, an employee, a vendor, etc.
For example, the person identifying module 226 may recognize a person as a member of the employee class within predetermined classes 238, having physical traits that match employee "John Doe" in known persons data 240. Thus, person identifying module 226 identifies the person as employee "John Doe," and controller 201 can now identify "John Doe" throughout the environment 100. In another example, person identifying module 226 may recognize the person as a customer within a particular predetermined class 238, but the person may not have traits matching any entry in known persons data 240. The person identifying module 226 may identify the customer using a placeholder such as "Customer 1," and controller 201 can now identify "Customer 1" throughout the environment 100.
The known persons data 240 may comprise data that was previously provided by persons wanting to be known to environment 100. In one exemplary embodiment, a person may provide various identification information to controller 201 such as image information containing the person's face, hair color, height, weight, age, and any other identifying information. The controller 201 may then store the provided information in known persons data 240 so that controller 201 is capable of identifying the person within environment 100 in the future. In another embodiment, a person wishing to be identified within environment 100 may simply acquiesce, through a personal device or a kiosk located within environment 100, to having various identifying characteristics read by the controller 201 via image analysis. That is, instead of having to manually provide information to controller 201, the controller 201 performs image analysis on one or more images containing the person to determine the person's visually identifiable information. In this manner, the controller 201 is capable of identifying a person within environment 100 and storing the related data in known persons data 240.
Transaction manager module 228 monitors and adaptively updates the various transactions for each of the persons within the environment 100. The transaction manager module 228 uses data from person identifying module 226 to create virtual transaction data 242 for each of the identified persons within environment 100. Virtual transaction data 242 contains dynamically updated information related to transactions between the person and the environment 100, such as items associated with the person, the number of items, and so forth. The transaction manager module 228 uses data representing the identified predefined behaviors 232 to add/remove items from a person's virtual transaction data 242.
Transaction manager module 228 is further configured to monitor and/or store completed virtual transactions for each of the persons in the environment 100. The completed transaction data 244 stores information related to virtual transactions that have been completed (e.g., items purchased by a customer) within the environment 100. The completed transaction data 244 comprises purchase confirmation information, which may include the quantity of the item sold, the date the transaction was completed, the person associated with the transaction, payment method used, whether the item was sold on sale, the condition of the item at time of sale, the store identification, a transaction identification, shipping information, and so forth.
Store inventory manager module 230 monitors the store inventory data 246 within the environment 100. Store inventory data 246 comprises information on the various items within environment 100. Some non-limiting examples include number in stock, number available, number ordered, security required to handle an item, and so forth. Store inventory manager module 230 may use data from behavior classification module 222 to update the store inventory data 246 based on the behavior associated with the identified person. The transaction manager module 228 may notify store inventory manager module 230 that an item has been added/removed from a person's virtual transaction data 242, and the store inventory manager module 230 updates the store inventory data 246 as necessary. Further, store inventory manager module 230 may be capable of tracking specific items throughout environment 100 such that controller 201 knows the location of all items at any given time. Thus, the store inventory manager module 230 is configured to monitor various items within the environment 100 in real time.
Security manager module 231 manages the security qualifications of persons and items within environment 100. The security manager module 231 may use information stored within store inventory data 246 and/or item information data 234 to determine the security qualification a person is required to have in order to access and/or purchase items within environment 100. For example, the security manager module 231 may use person identifying module 226 to determine the identity of a person within environment 100, such as "Customer Alpha." After determining the identity of "Customer Alpha," the security manager module 231 may use the known persons data 240 to determine the security qualification of "Customer Alpha," which determines which access control devices 110 "Customer Alpha" can unlock. Thus, the security manager module 231 may allow persons to unlock access control devices 110 within environment 100 in real time in order to access secured items, without the assistance of an employee.
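The lookup flow this paragraph describes can be sketched briefly. Everything below is a hypothetical illustration; the dictionary shapes, threshold values, and device identifiers are assumptions, not disclosed data.

```python
# Hypothetical data for security manager module 231: a person's security
# qualification from known persons data 240, and per-device thresholds for
# the access control devices 110.
known_persons = {
    "Customer Alpha": {"security_qualification": 3},
    "John Doe": {"security_qualification": 5},
}
access_control_devices = {
    "110-A": {"threshold": 2},  # e.g., a locked display case
    "110-B": {"threshold": 4},  # e.g., a door to the secure region 112
}

def unlockable_devices(person: str) -> list[str]:
    """List the devices whose threshold this person's qualification satisfies."""
    level = known_persons.get(person, {}).get("security_qualification", 0)
    return [dev for dev, info in access_control_devices.items()
            if level >= info["threshold"]]

print(unlockable_devices("Customer Alpha"))  # ['110-A']
print(unlockable_devices("John Doe"))        # ['110-A', '110-B']
```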
Turning to FIG. 3, the controller 201 identifies a customer 302 upon the customer 302 entering the environment 100 through one of the doorways 104.
The controller 201 may take into account various visually observable attributes of the customer 302 when determining the identity of customer 302. Some non-limiting examples include gender, age, height, weight, hair color, eye color, skin tone, and tattoos. The controller 201 may perform identification based on one or more additional attributes, such as a time of day, clothing worn by the customer 302, jewelry, and so forth. For example, the controller 201 determines that the customer 302 is female, approximately 5 feet tall, has long brown hair and blue eyes, is approximately 32 years old, and entered the environment 100 at 6:00 P.M.
In another exemplary embodiment, the controller 201 may perform identification based on the presence of a customer device 208 (FIG. 2).
In some embodiments, the determined attributes can be collected anonymously (i.e., without association to any personally identifying information) to protect and respect customer privacy, but this is not a requirement. For example, the controller 201 may use some or all of the determined attributes to identify customer 302 as “Customer A.” After identifying “Customer A,” in some cases the controller 201 deletes some or all of the determined attributes, as the controller 201 may subsequently track the customer 302 as “Customer A.” In this way, the controller 201 need not store all determined information related to customer 302 throughout environment 100 once identified. Rather, a placeholder can be used to track customer 302 and the virtual transaction can be associated with the placeholder.
In another exemplary embodiment, customer 302 has previously provided information to the controller 201 and/or allowed the controller 201 to store information on customer 302. For example, after recognizing the various attributes of the customer 302, the controller 201 may recognize the identified customer 302 as a known person from known persons data 240 (i.e., customer 302 is a loyalty member of, or has previously been within, the environment 100). Thus, the controller 201 may use the visually determined attributes to associate the customer 302 with a specific known person (e.g., "Jane Doe" having loyalty number "0123456").
Turning to FIG. 3, the controller 201 may determine a facial visibility 308 of the customer 302 to assess whether the person's face is sufficiently visible for identification.
In one exemplary embodiment, if the controller 201 determines the person's face is too obscured, the controller 201 may notify an employee via an employee device 210 (FIG. 2).
As illustrated, the facial visibility score 328 is based upon how visible the different parts of the face are. In one exemplary embodiment, the visibility score 328 is the summation of the determined visibility scores of the person's eyes 314, nose 316, mouth 318, chin 320, forehead 322, ears 324, and hair 326. The aforementioned facial features are each scored on a scale of one to one hundred (1-100); thus, in this example, the visibility score 328 has a maximum possible score of seven hundred (700). The controller 201 may require a predetermined minimum visibility score 328 in order to identify a person, e.g., a visibility score over six hundred (600) indicates that a person's face is sufficiently visible to properly identify said person. Thus, as shown, the visibility score 328 of six hundred thirty-two (632) meets the minimum visibility required to properly identify a person. While the visibility score 328 was described as a summation of various facial features, the visibility score 328 may be determined using any other suitable arithmetic and/or logical functions, such as a weighted sum of parameter values, comparing parameter values with thresholds to produce a binary decision for each, fuzzy logic, and so forth.
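The summation-and-threshold rule just described is easy to make concrete. In the sketch below, the per-feature values are examples chosen to reproduce the 632 from the text; only the seven feature names, the 1-100 scale, and the 600 minimum come from the description above.

```python
# Facial visibility score 328: per-feature visibility on a 1-100 scale,
# summed across seven features (maximum 700) and compared to a minimum of 600.
FEATURES = ("eyes", "nose", "mouth", "chin", "forehead", "ears", "hair")
MIN_VISIBILITY = 600  # predetermined minimum visibility score

def facial_visibility_score(per_feature: dict[str, int]) -> int:
    return sum(per_feature.get(f, 0) for f in FEATURES)

# Example values (assumed) that sum to the 632 used in the text.
scores = {"eyes": 95, "nose": 90, "mouth": 88, "chin": 92,
          "forehead": 97, "ears": 80, "hair": 90}
total = facial_visibility_score(scores)
print(total, total >= MIN_VISIBILITY)  # 632 True -> face is identifiable
```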
In one exemplary embodiment, the controller 201 first determines, using the facial visibility 308, whether the face of customer 302 is sufficiently visible before performing the identification calculation. To identify a person within environment 100, the controller 201 may use the person identification 310 calculation.
The controller 201 may compare the identification score 346 with a threshold value to determine that a match exists between the unknown person and the known person. For example, an identification score 346 over seventy-five (75) indicates that the unknown person and the known person are the same. Thus, the identification score 346 represents the probability that the unknown person is the known person.
The controller 201 may perform a facial recognition analysis on one or more images captured from the visual sensors 102 located throughout the environment 100 to assist in identifying a person. Generally, facial recognition uses the unique characteristics of a person's face to identify said person.
The person identification 310 calculation contains data on the hair 334, height 336, weight 338, and age 340 of the unknown and known persons, which the controller 201 may compare in determining the identification score 346. The controller 201 may perform image analysis on one or more images captured from the visual sensors 102 to determine the aforementioned parameters of an unknown person within environment 100. As illustrated, the controller 201 determined that unknown person customer 302 has brown hair 334, a height 336 of six feet one inch, a weight 338 of two hundred twenty-five (225) pounds, and an age 340 of thirty-six (36). The controller 201 may then compare these determined characteristics with known persons data 240 (FIG. 2).
The controller 201 may also take into account known features 342 of persons in determining the identification score 346. Known features 342 are features of a person that may help identify a person, but do not fall into the aforementioned categories. Some non-limiting examples of known features 342 may include tattoos and body art, scars, medical devices (e.g., a wheelchair, hearing aids, a cane, and so forth), and other readily identifiable features.
The controller 201 may also take into account whether a person has a personal device 344 that has previously been registered with environment 100. The personal device 344 may be a customer device 208 (FIG. 2).
As mentioned above, the controller 201 compares the parameters of the unknown person listed in person identification 310 to the parameters of the known person listed in person identification 310 to determine the identification score 346. In the illustrated example, the controller 201 determines that customer 302 and known customer 123456 have an identification score 346 of ninety-eight (98) out of a maximum possible score of one hundred (100), which represents a very strong likelihood that customer 302 and known customer 123456 are in fact the same person. Because the identification score 346 exceeds the threshold value, the controller 201 determines that customer 302 is known customer 123456.
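One way to realize the comparison just described is a weighted agreement score, sketched below. The weights, the 5% numeric tolerance, and the parameter encodings are assumptions; only the parameter list, the 0-100 scale, and the threshold of seventy-five come from the description.

```python
# Person identification 310: fold per-parameter agreement into an
# identification score 346 on a 0-100 scale; scores above 75 count as a match.
WEIGHTS = {"hair": 15, "height": 15, "weight": 15, "age": 15,
           "known_features": 20, "personal_device": 20}  # assumed; sums to 100
THRESHOLD = 75

def identification_score(unknown: dict, known: dict) -> float:
    score = 0.0
    for param, weight in WEIGHTS.items():
        u, k = unknown.get(param), known.get(param)
        if u is None or k is None:
            continue  # parameter not observed; contributes nothing
        if param in ("height", "weight", "age"):
            if abs(u - k) <= 0.05 * k:  # numeric: credit within 5% (assumed)
                score += weight
        elif u == k:  # categorical: exact match required
            score += weight
    return score

unknown = {"hair": "brown", "height": 73, "weight": 225, "age": 36,
           "known_features": "tattoo", "personal_device": "device-344"}
known = dict(unknown)  # known customer record with matching parameters
print(identification_score(unknown, known) >= THRESHOLD)  # True -> match
```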
In one exemplary embodiment, the controller 201 may then use the information stored in known persons data 240 (FIG. 2) to track the identified person throughout the environment 100.
Turning to FIG. 4, an exemplary data structure for a store inventory 402 is shown. In one exemplary embodiment, the controller 201 may track the status 406 of purchasable items, such as whether an item is past its expiration date or is still fresh. Further, the controller 201 may monitor the days until expired 407 of the individual items. For example, the controller 201 may recognize that item B only has five days until expired 407 and may preemptively order more of item B so as to replace the stock when the status 406 changes from "Fresh" to "Expired." This may allow the environment 100 to increase customer satisfaction by reducing the time items are out of stock.
While the data structure has been described as a store inventory 402 for simplicity, the controller 201 may use any number and type of inventory data for environment 100. For example, the controller 201 may have a separate inventory for each of the display units 106 (FIG. 1).
The controller 201 may also track the number in stock 408 compared to the number of items available 410 within the environment 100. For example, when a customer adds one (1) of item B to their virtual transaction, the number available 410 for item B is reduced by one (1), but the number in stock 408 for item B still reflects five (5) items, as the customer has not yet completed the transaction. In this way, the controller 201 is able to track items from the moment an item is associated with a customer until the customer completes the transaction. The controller 201 may further track a number ordered 412 of particular items. For example, the system may recognize item C has a status 406 of "Expired" so it is no longer available for sale, which is reflected in item C having zero (0) as the number available 410. The controller 201 may order more, twenty (20) as shown, to replenish the stock, which is reflected by the number ordered 412.
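The distinction between in stock 408 and available 410 amounts to two counters updated at different times, as the following sketch (with assumed field names) shows.

```python
# Available 410 falls as soon as an item joins a virtual transaction;
# in stock 408 falls only when the transaction completes.
inventory = {"B": {"in_stock": 5, "available": 5, "ordered": 0}}

def add_to_virtual_transaction(item_id: str, quantity: int = 1) -> None:
    inventory[item_id]["available"] -= quantity  # reserved for the customer

def complete_transaction(item_id: str, quantity: int = 1) -> None:
    inventory[item_id]["in_stock"] -= quantity  # stock actually leaves the store

add_to_virtual_transaction("B")
print(inventory["B"])  # {'in_stock': 5, 'available': 4, 'ordered': 0}
complete_transaction("B")
print(inventory["B"])  # {'in_stock': 4, 'available': 4, 'ordered': 0}
```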
The controller 201 may also track the security required 413 to be in possession of items within environment 100. As shown, the security required 413 ranges from zero (0), meaning no security requirement, to five (5), meaning the highest security requirement. While a scale of zero to five (0-5) is used for simplicity, it is to be understood that any range and/or value may be used, such as a one hundred (100) point scale, a binary scale, written descriptions (e.g., level one), and so forth. In one exemplary embodiment, an item having a security required 413 of zero (0), such as item A, is accessible by anyone and is not secured behind an access control device 110 (FIG. 1).
An exemplary data structure for individual customer information 414 is illustrated in FIG. 4.
Associated persons 420 may be persons whom the controller 201 recognizes should be associated with a single virtual transaction, such as members of a family shopping together within environment 100 on a single transaction. As shown, associated persons 420 includes information for "Name 1," who is also a known person 416, and "Customer 3." "Customer 3" may be a friend of the aforementioned family who is shopping with the family in environment 100. The controller 201 recognizes "Customer 3" should be associated with the family based on interactions between the family and "Customer 3," even though "Customer 3" is not a known person. In this manner, the controller 201 can recognize associated persons 420 even if they are not known persons 416.
Customer information 414 also contains data on the security qualification 419 of the customer. As shown, the security qualification 419 is on a scale from zero (0) to five (5), with the customer having a security qualification 419 of three (3). While a numeric scale from zero to five (0-5) has been used for simplicity, the security qualification may be represented using any suitable scale and/or logic for evaluating and/or representing a security qualification. Some non-limiting examples include 1-100, binary, position (e.g., employee, manager, customer, etc.), a weighted value based on various parameters, and so forth.
The security qualification 419 may be determined by controller 201 based on certain characteristics of the customer such as age (e.g., old enough to purchase age restricted items), loyalty level (e.g., has shopped at store sufficient number of times to be considered trustworthy), and so forth. The security qualification 419 may also be determined by the environment 100 (e.g., a default security qualification) such that the controller 201 is provided with the determined security qualification of the person.
Further, the security qualification 419 may also be based upon information provided by the customer. For example, an employee of environment 100 may check the identification of the customer to verify the customer is a certain age (e.g., at least 18 years old) and provide this information to controller 201. The controller 201 may then determine the security qualification for the customer based upon the provided information. The controller 201 may use the security qualification 419 of a customer to allow the customer access to secured items via an access control device 110 (FIG. 1).
Further, customer information 414 also contains information on different payment types 422. As shown, payment types include a credit card (CC#) as well as a cash balance stored with the environment 100, but may include any suitable alternate payment method, such as a virtual currency. For example, a customer may have associated credit card information with their loyalty number 418 and deposited cash with the environment 100. By using the payment information included in payment types 422, the customer may seamlessly checkout using one of the payment types 422.
For example, as the customer interacts with purchasable items and/or transactional items within the environment 100, the controller 201 may add items to the customer's virtual transaction 424. The customer may then proceed to the checkout region 108 as shown in FIG. 1.
Customer information 414 also contains information on the virtual transaction 424. The virtual transaction 424 may contain information on items such as item identification 426, quantity 428 of the item, location 430 of the item such as in a cart 304 (FIG. 3), and so forth.
In one exemplary embodiment, upon determining a customer has an item they are not qualified to possess, the controller 201 may notify an employee of environment 100 via an employee device 210 (FIG. 2).
The controller 201 may also use a data structure for storing employee information 436.
The controller 201 may also monitor whether the employee is on shift 442 and on break 444. This can allow the controller 201 to ensure employees are properly clocking in/out as well as receiving breaks. For example, the controller 201 may recognize an employee is entering the environment 100 and may then check whether the employee is supposed to be on shift 442 and clock the employee in. In another example, the controller 201 may recognize an employee has not taken their lunch break as required and send a reminder to an employee device 210 (FIG. 2).
Employee information 436 contains data on the security qualification 446 for an employee. As shown, the main difference between the security qualification 419 (FIG. 4) of a customer and the security qualification 446 of an employee is that the employee's security qualification 446 may depend on whether the employee is on shift 442.
Employee information 436 contains a virtual transaction 452. In some embodiments, the virtual transaction 452 maintained for an employee differs in content from the virtual transaction 424 maintained for a customer. Virtual transaction 452 contains information such as an item I.D. 454, quantity of the item 456, whether an item is damaged 458, location of the item 460, and security required 462 to be in possession of the item. The controller 201 may update the virtual transaction 452 of an employee based on interactions with items in environment 100. For example, an employee may be stocking item A within the environment 100 and one (1) of item A is damaged 458. The controller 201 may recognize an item is damaged or the employee may inform the system that an item is damaged. After being informed of the damaged item, the controller 201 will appropriately update the virtual transaction 452 as well as update the store inventory 402 (FIG. 4).
The controller 201 may use the security required 462 for an item in conjunction with the security qualification 446 to ensure an employee is not in possession of an item above their security qualification 446. For example, environment 100 may have items which should only be handled by a manager. The controller 201 may recognize that an employee's virtual transaction 452 contains an item with a security required 462 higher than the employee's security qualification 446. The controller 201 may then take remedial actions based on this determination, such as notifying a manager on duty within environment 100, informing the employee, and so forth. In this manner, the controller 201 has the ability to ensure that no employee possesses an item they should not have access to.
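This possession check reduces to comparing each item's security required 462 against the person's security qualification 446; a sketch with assumed structures follows.

```python
# Flag items in a virtual transaction whose required security exceeds the
# holder's qualification, so remedial action (e.g., notifying a manager) can follow.
def flag_violations(virtual_transaction: list[dict], qualification: int) -> list[str]:
    return [entry["item_id"] for entry in virtual_transaction
            if entry["security_required"] > qualification]

transaction = [{"item_id": "A", "security_required": 0},
               {"item_id": "D", "security_required": 5}]
print(flag_violations(transaction, qualification=3))  # ['D'] -> notify a manager
```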
The controller 201 may have a data structure containing vendor information 464. As shown in FIG. 4, the vendor information 464 includes a virtual transaction 476.
The virtual transaction 476 contains item I.D. 478, quantity 480, number ordered 482, location 484, and security required 486 to be in possession of the item. In some embodiments, the virtual transaction 476 for a vendor differs in content from the virtual transaction 424 (customer) and/or the virtual transaction 452 (employee). The controller 201 may use the virtual transaction 476 to update the store inventory 402. For example, a vendor may be delivering item D. The vendor may have brought only five (5) of item D, the quantity 480, even though the controller 201 had ordered ten (10), as shown by the number ordered 482. The controller 201 may also recognize the location 484 of item D as being on a trolley associated with the vendor. Thus, the controller 201 is able to update the store inventory 402 in real time based on the vendor information 464.
Vendor information 464 contains data on the security qualification 470 of the vendor. Similar to the employee security qualification 446, the vendor security qualification 470 takes into account whether the vendor is on shift 472 or off shift 474. While this exemplary embodiment uses on shift vs off shift, it is to be understood that this is not necessary and the controller 201 may use any parameter to determine the vendor's security qualification 470 such as age, time of day, or any other parameter. Further, the controller 201 may use any logic for evaluating and/or representing a security qualification 470 of the vendor. The controller 201 may use the security required 486 for an item in conjunction with the security qualification 470 of the vendor, similar to the employee information 436 and customer information 414, to ensure the vendor is not in possession of an item above their security qualification 470.
Turning to FIG. 5, a view 500 illustrates a customer 502 interacting with an access control device 110-A, which is depicted as a display unit 504 having doors 506 secured by a security element 510.
While specific features of the exemplary access control device 110-A have been described with reference to a display unit, it is to be understood that the access control device 110-A can be any type of enclosed container. Some non-limiting examples include a safe, a beverage cooler, and so forth. Further, while a customer is used to explain the exemplary embodiments above, the invention should not be limited to such. For example, the person could be an employee, a vendor, or a member of another predefined class of persons within environment 100.
Once the customer 502 begins moving towards a specific access control device 110, such as access control device 110-A shown, the controller 201 may reorient or otherwise control the visual sensor 102 to better capture the customer's interaction with the access control device 110. For example, as shown in view 500, the controller 201 may recognize that the customer 502 is stepping towards access control device 110-A in order to gain access to one or more secured items. In one exemplary embodiment, the controller 201 has a predetermined distance that indicates a person is moving towards the access control device. That is, the controller 201 recognizes that the customer 502 is within proximity distance 509, which indicates customer 502 is attempting to interact with the access control device 110-A. While a predetermined distance is used for simplicity, the controller 201 may use any method/logic for determining a person is moving towards the access control device. Some non-limiting examples include determining the movement speed and/or direction of the person towards/away from the access control device, fuzzy logic, a weighted calculation, and so forth.
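A simple version of this proximity test is sketched below, assuming tracked 2-D floor positions; the distance value and the two-sample closing-in check are illustrative stand-ins for whatever motion logic the controller applies.

```python
import math

PROXIMITY_DISTANCE = 2.0  # meters; stands in for proximity distance 509

def is_approaching(prev_pos: tuple, curr_pos: tuple, device_pos: tuple) -> bool:
    """True when the person is inside the proximity distance and closing in."""
    d_prev = math.dist(prev_pos, device_pos)
    d_curr = math.dist(curr_pos, device_pos)
    return d_curr <= PROXIMITY_DISTANCE and d_curr < d_prev

print(is_approaching((5.0, 0.0), (1.5, 0.0), (0.0, 0.0)))  # True: moving closer
print(is_approaching((1.0, 0.0), (3.0, 0.0), (0.0, 0.0)))  # False: moving away
```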
Before allowing customer 502 to access the items stored within access control device 110-A, the controller 201 may first identify customer 502 to determine whether customer 502 has the requisite security qualification to access the secured items. In one exemplary embodiment, the customer 502 was identified upon entering environment 100 as explained with regard to FIG. 3.
In one exemplary embodiment, the controller 201 uses the visual sensor 102 to capture image information that includes a behavior of customer 502. The controller 201 may compare the behavior of customer 502 to a plurality of predetermined behaviors to categorize the behavior. Once the controller 201 has categorized the behavior of customer 502, the controller 201 may then compare the determined behavior with a predetermined behavior associated with opening the access control device 110-A. Upon determining the behavior of customer 502 is the same as the behavior associated with opening the access control device 110-A, the controller 201 may unlock access control device 110-A for customer 502 to access the secured items.
For example, the controller 201 may determine that customer 502 is attempting to open the access control device 110-A based on the behavior of customer 502. As discussed above, the controller 201 may recognize a predefined visual access behavior to indicate a person would like to access the secure region through access control device 110-A. That is, the access control device 110-A may have a visual access behavior 511 that is a "thumbs up." As shown in view 500, upon recognizing that customer 502 has performed the "thumbs up" within proximity distance 509 of the access control device 110-A, the controller 201 may deactivate the security element 510 to unlock the access control device 110-A.
While a “thumbs up” has been used as an example of an access behavior, the invention should not be limited to such. Some non-limiting examples of access behaviors include reaching toward a handle, knob or lever of the access control device, performing a predefined gesture mimicking operating a handle, knob or lever of a door while in proximity of the access control device, performing a predefined gesture mimicking opening an enclosure with dual doors while in proximity of the access control device, reaching toward the access control device, waving towards the access control device, performing a predefined hand gesture indicating approval while in proximity of the access control device (e.g., a thumbs up, waving), and mouthing a predefined word while in proximity of the access control device.
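Putting the pieces together, the unlock decision combines a behavior match with the security comparison recited in the claims. The sketch below is illustrative only; the behavior labels and function names are assumptions.

```python
# Unlock only when the observed behavior is a predefined visual access behavior
# AND the person's security level satisfies the device's threshold.
ACCESS_BEHAVIORS = {
    "reach_toward_handle", "mimic_handle_operation", "mimic_dual_door_open",
    "reach_toward_device", "wave_toward_device", "thumbs_up", "mouth_keyword",
}

def should_unlock(behavior: str, qualification: int, threshold: int) -> bool:
    return behavior in ACCESS_BEHAVIORS and qualification >= threshold

if should_unlock("thumbs_up", qualification=3, threshold=2):
    print("transmit first control signal -> deactivate security element 510")
```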
In another exemplary embodiment, the controller 201 may determine that the categorized behavior of the person is an emergency behavior indicating the person is in distress and/or needs assistance. In one example, during a robbery occurring within the environment, the customer 502 may represent a supervisor or other employee attempting to access the secure region 112 under threat of force. In such a case, the person may provide a predefined gesture towards the access control device 110-A, which controller 201 identifies as a predefined emergency behavior indicating that the person is in distress or needs assistance. In some embodiments, in response to identifying the predefined emergency behavior, the controller 201 communicates an emergency message to another computing device within, or external to, the environment 100. In one non-limiting example, the emergency message is communicated to a networked computing device associated with a supervisor of the person. In another non-limiting example, the emergency message is communicated over a network with a security service and/or law enforcement. In some cases, the controller 201 may further support the security/law enforcement function by recording interactions with the access control devices 110, operating the visual sensor 102 to focus on the person and/or other persons included within its field of view (e.g., suspects).
In some embodiments, the controller 201 further deactivates the security element 510 responsive to identifying the predefined emergency behavior. In this way, the person successfully accesses items within the access control device 110-A, avoiding the appearance of resisting the robbery and/or of having alerted security or law enforcement. Communication of the emergency message can be performed with any suitable timing relative to deactivating the security element 510, such as non-overlapping in time (i.e., before or after) or at least partially overlapping in time. In this manner, the controller 201 is capable of taking different actions depending on the determined physical movement of persons within the environment. One possible shape of this emergency handling is sketched below.
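The following fragment illustrates, under assumed interfaces, how the emergency path might send notifications while still deactivating the security element; the behavior label, message format, and notification targets are hypothetical:

```python
import time

# Illustrative sketch only; labels and callbacks are hypothetical.
EMERGENCY_BEHAVIORS = {"distress_gesture"}


def handle_emergency(device_id, behavior, notify, deactivate_security_element):
    """Send an emergency message and still unlock the device so the person
    does not appear to resist or to have alerted anyone."""
    if behavior not in EMERGENCY_BEHAVIORS:
        return False
    message = {"type": "emergency", "device": device_id, "time": time.time()}
    # Timing is flexible: the message may be sent before, after, or
    # concurrently with deactivating the security element.
    notify("supervisor", message)
    notify("security_service", message)
    deactivate_security_element(device_id)
    return True
```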
In some embodiments, the controller 201 may identify that the access control device 110-A is not properly secured. For example, the customer 502 may have walked away from the display unit 504 without closing the doors 506. The controller 201 may perform one or more actions in response, such as communicating a message to a computing device associated with the customer 502 and/or an employee of environment 100. In another exemplary embodiment, the controller 201 sends a signal to a mechanical device (e.g., an actuator) attached to the doors 506 of the display unit 504, causing the mechanical device to close the doors 506. In this manner, the controller 201 is capable of automatically re-securing the items stored in an access control device once a person has finished interacting with it. A minimal sketch of such re-securing logic follows.
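The sketch below shows one way such re-securing logic could be organized; the device, tracker, actuator, and notifier interfaces, as well as the grace period, are hypothetical assumptions rather than disclosed details:

```python
# Illustrative sketch; all interfaces are hypothetical stand-ins for
# hardware the disclosure does not specify at this level.

DOOR_OPEN_GRACE_PERIOD_S = 30  # assumed delay before re-securing


def resecure_if_abandoned(device, person_tracker, actuator, notifier):
    """Close and re-lock a device whose doors were left open after the
    person walked away."""
    if not device.doors_open:
        return
    if person_tracker.is_anyone_near(device.device_id):
        return  # someone is still interacting with the device
    if device.seconds_since_last_interaction() < DOOR_OPEN_GRACE_PERIOD_S:
        return
    notifier.send("employee", f"{device.device_id} left open; re-securing")
    actuator.close_doors(device.device_id)  # mechanical device closes the doors
    device.lock()
```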
Once customer 502 has finished interacting with the access control device 110-A, customer 502 may continue to walk through environment 100, adding items to their virtual transaction. Once customer 502 has completed their interaction with environment 100, the customer 502 may proceed to the checkout region 108.
Turning to the next example, view 600 illustrates a customer 602 approaching an access control device 110-B that provides entry to the secure region 112.
Once the customer 602 begins moving towards access control device 110-B, the controller 201 may reorient or otherwise control the visual sensor 102 to better capture the customer's interaction with the access control device 110-B. For example, as shown in view 600, the controller 201 may recognize that the customer 602 is stepping towards access control device 110-B in order to gain entry to the secure region. Before allowing customer 602 to access the secure region via access control device 110-B, the controller 201 may first identify customer 602 to determine whether customer 602 has the requisite security qualification 419 to access the secure region. A minimal sketch of the sensor-reorientation step follows.
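As to the sensor reorientation mentioned above, one illustrative sketch (with a hypothetical pan-tilt-zoom style sensor interface and location helpers, none of which are specified by the disclosure) might look as follows:

```python
# Illustrative sketch; sensor and location interfaces are hypothetical.

def track_approach(sensor, person, device):
    """Reorient a visual sensor so an approaching person and the access
    control device stay in the same field of view."""
    if not person.is_moving_toward(device.location):
        return
    # Aim between the person and the device, then adjust the view until
    # both are framed.
    sensor.point_at(midpoint(person.location, device.location))
    sensor.zoom_to_fit([person.location, device.location])


def midpoint(a, b):
    """Midpoint of two (x, y) positions."""
    return ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)
```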
After determining that customer 602 has the requisite security qualification to access the secured items, the controller 201 may determine that customer 602 is attempting to open the access control device 110-B. The controller 201 may use the visual sensor 102 to capture image information that includes a physical movement by customer 602. The controller 201 may be configured with a predetermined movement that determines whether access control device 110-B should be opened, such as reaching for a handle on access control device 110-B, mimicking opening a door, or any other movement that environment 100 determines should allow a person access to the secured items. Upon recognizing the predetermined movement by customer 602, the controller 201 unlocks the access control device 110-B so that customer 602 can enter the secure region. In another exemplary embodiment, in addition to unlocking the doors, the controller 201 automatically opens the doors so that the customer 602 does not have to physically open them. One possible shape of this entry logic is sketched below.
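A short illustrative sketch of the unlock-and-optionally-open step follows; the person and device interfaces, and the numeric comparison of security levels, are assumptions introduced for illustration:

```python
# Illustrative sketch; the person and device interfaces are hypothetical.

def grant_entry(person, device, auto_open=False):
    """Unlock (and optionally open) an access control device once the
    person's security qualification satisfies the device's threshold."""
    if person.security_level < device.threshold_security_level:
        return False
    device.unlock()          # deactivate the security element
    if auto_open:
        device.open_doors()  # actuator opens the doors for the person
    return True
```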
Once the customer 602 is within the secure region 112, the controller 201 may use the visual sensors 102 to determine that the customer 602 is interacting with the display unit 106. Accordingly, the controller 201 may add items to customer 602's virtual transaction 424.
Turning to the next example, a second customer 606 approaches the access control device 110-B while customer 602 remains within the secure region 112. As explained below, customer 606 does not have the requisite security qualification to access the secure region 112, and customer 602 may attempt to leave the secure region 112 through the access control device 110-B while customer 606 is nearby.
In one exemplary embodiment, the controller 201 prevents customer 602 from opening access control device 110-B in order to ensure that customer 606 does not gain access to the secure region 112. The controller 201 may inform customer 602 as to why they are unable to leave the secure region 112. That is, the controller 201 may communicate with customer 602 via a display, a personal device associated with the customer, a speaker, or any other communication device. Similarly, the controller 201 may communicate with customer 606 to request that customer 606 relocate away from the access control device 110-B. Further, the controller 201 may inform an employee of environment 100 of the situation so that the employee can take the appropriate action(s) to ensure customer 606 does not gain access to the secure region 112.
In an alternative embodiment, instead of preventing customer 602 from leaving the secure region 112, the controller 201 may allow customer 602 to leave the secure region while monitoring customer 606 to ensure that customer 606 does not sneak into the secure region 112 while the access control device 110-B is open. If customer 606 does manage to sneak into the secure region 112, the controller 201 may take appropriate actions, such as notifying an employee of environment 100 of the security breach. Both policies are sketched below.
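The following fragment illustrates the two policies just described as a single decision function; the device, person, and notifier interfaces, the policy names, and the `entered_while_open` check are hypothetical:

```python
# Illustrative sketch of the two policies described above; all interfaces
# are hypothetical.

def handle_exit_request(device, exiting_person, nearby_persons, policy, notifier):
    """Either hold the door closed while an unauthorized person is nearby,
    or open it and watch for a sneak-in."""
    unauthorized = [p for p in nearby_persons
                    if p.security_level < device.threshold_security_level]
    if not unauthorized:
        device.unlock()
        return
    if policy == "prevent":
        device.keep_locked()
        notifier.send(exiting_person, "Exit delayed: unauthorized person nearby")
        for p in unauthorized:
            notifier.send(p, "Please step away from the access control device")
        notifier.send("employee", f"Unauthorized person near {device.device_id}")
    elif policy == "monitor":
        device.unlock()
        for p in unauthorized:
            if device.entered_while_open(p):  # sneak-in detected
                notifier.send("employee",
                              f"Security breach at {device.device_id}")
```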
While the foregoing has been described with reference to the access control device 110-B, the embodiments are not limited to this particular configuration, and the techniques described herein may be applied to any suitable type of access control device.
In another exemplary embodiment, the controller 201 is capable of determining that a person is in possession of a secured item that they should not possess, and takes action(s) to remove the item from the person's possession. For example, customer 602 may provide customer 606 with the previously secured item 610. However, as previously mentioned, customer 606 does not have the required security qualification to handle the secured item 610 because they are too young. Thus, the controller 201 determines, based upon the required security level of the secured item 610 and customer 606's security qualification, that customer 606 should not possess the secured item 610. The controller 201 may notify an employee of environment 100 that customer 606 is in possession of the secured item 610 so that the employee can take appropriate action. Further, since the controller 201 provides substantially continuous tracking of the location of the secured item 610 and of who is in possession of the secured item 610, the controller 201 can identify that customer 602 provided the secured item 610 to customer 606. Thus, the controller 201 may prevent customer 602 from accessing the secure region 112 in the future because of the security breach. For example, the controller 201 may update the known persons data associated with customer 602 to reduce their security qualification. In this manner, the controller 201 is capable of automatically enforcing the security of items throughout the environment, as well as adjusting the security qualification of persons as needed. A minimal sketch of this enforcement logic follows.
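One illustrative shape for this enforcement logic is shown below; the item tracker and known-persons store are hypothetical abstractions over the controller's tracking data, not disclosed components:

```python
# Illustrative sketch; the item tracker and known-persons store are
# hypothetical abstractions over the controller's tracking data.

def enforce_item_possession(item, holder, item_tracker, known_persons, notifier):
    """Flag a restricted item in the wrong hands and downgrade whoever
    handed it over."""
    if holder.security_level >= item.required_security_level:
        return
    notifier.send("employee",
                  f"{holder.person_id} holds restricted item {item.item_id}")
    # Continuous tracking reveals who provided the item.
    giver_id = item_tracker.previous_holder(item.item_id)
    if giver_id is not None:
        # Reduce the giver's security qualification in the known-persons data.
        known_persons.reduce_security_level(giver_id)
```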
Turning to an exemplary method 700 of facilitating access by a first person to a secure region within an environment, the method 700 may be performed using the controller 201 and the visual sensors 102 described above.
Method 700 begins at block 702, where at least one visual sensor disposed within the environment acquires first image information including a first person and an access control device. In one embodiment, the first image information is acquired at the time the person enters the environment. In another embodiment, the first image information is acquired when the person interacts with the access control device. For example, at least one visual sensor may be oriented to acquire first image information upon the person approaching the access control device within the environment.
At block 704, the first person is identified using image analysis performed on the first image information, wherein identifying the first person comprises associating the first person with a security level selected from a plurality of predetermined security levels. The identification may include an analysis of the image information to determine physical attributes of the first person. In another embodiment, the system may compare the physical attributes of the first person to persons previously known to the environment to identify the person and select the predetermined security level. A minimal identification sketch follows.
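The fragment below illustrates one possible form of this identification step; the attribute extractor and known-persons store are hypothetical, and a real system would presumably use a trained person re-identification model here:

```python
# Illustrative sketch; interfaces are hypothetical.

DEFAULT_SECURITY_LEVEL = 0  # assumed level for unrecognized persons


def identify_person(image_info, extract_attributes, known_persons):
    """Match visually derived attributes against persons previously known
    to the environment and select a predetermined security level."""
    attributes = extract_attributes(image_info)
    match = known_persons.best_match(attributes)
    if match is None:
        return None, DEFAULT_SECURITY_LEVEL
    return match.person_id, match.security_level
```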
At block 706, a first behavior of the first person relative to the access control device is identified using the image analysis. In one embodiment, the first behavior is one of a plurality of predetermined behaviors. In another embodiment, the first behavior is a behavior defined as an emergency behavior to signal that the first person is in need of assistance.
At block 708, upon determining that the first behavior corresponds to a predefined visual access behavior, and that the security level of the first person satisfies a predetermined threshold security level associated with the access control device, a first control signal is transmitted to a security element of the access control device. Transmitting the control signal to the security element deactivates the security element and permits the first person to physically access the secure region through the access control device. In one embodiment, the predefined visual access behavior is reaching for a handle located on the access control device. Method 700 is completed following block 708. An end-to-end sketch of blocks 702-708 follows.
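Tying the blocks together, the following purely illustrative sketch walks through blocks 702-708 in order; every name here is a hypothetical stand-in for a component the disclosure describes only in prose:

```python
# Illustrative end-to-end sketch of blocks 702-708; interfaces hypothetical.

def method_700(visual_sensor, identify_person, identify_behavior,
               access_behaviors, device):
    # Block 702: acquire image information including the person and device.
    image_info = visual_sensor.acquire()
    # Block 704: identify the person and select their security level.
    person_id, security_level = identify_person(image_info)
    # Block 706: identify the person's behavior relative to the device.
    behavior = identify_behavior(image_info)
    # Block 708: if the behavior is a predefined visual access behavior and
    # the security level satisfies the device's threshold, transmit the
    # control signal that deactivates the security element.
    if behavior in access_behaviors and security_level >= device.threshold:
        device.transmit_control_signal("deactivate")
```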
The descriptions of the various embodiments of the present disclosure have been presented for purposes of illustration, but are not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, the practical application or technical improvement over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.
In the preceding, reference is made to embodiments presented in this disclosure. However, the scope of the present disclosure is not limited to specific described embodiments. Instead, any combination of the following features and elements, whether related to different embodiments or not, is contemplated to implement and practice contemplated embodiments. Furthermore, although embodiments disclosed herein may achieve advantages over other possible solutions or over the prior art, whether or not a particular advantage is achieved by a given embodiment is not limiting of the scope of the present disclosure. Thus, the following aspects, features, embodiments and advantages are merely illustrative and are not considered elements or limitations of the appended claims except where explicitly recited in a claim(s). Likewise, reference to “the invention” shall not be construed as a generalization of any inventive subject matter disclosed herein and shall not be considered to be an element or limitation of the appended claims except where explicitly recited in a claim(s).
Aspects of the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module,” or “system.”
The present disclosure may be a system, a method, and/or a computer program product. The computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present disclosure.
The computer readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. A non-exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing. A computer readable storage medium, as used herein, is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.
Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network and/or a wireless network. The network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. A network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.
Computer readable program instructions for carrying out operations of the present disclosure may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++ or the like, and conventional procedural programming languages, such as the “C” programming language or similar programming languages. The computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider). In some embodiments, electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present disclosure.
Aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer readable program instructions.
These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts or carry out combinations of special purpose hardware and computer instructions.
Embodiments of the disclosure may be provided to end users through a cloud computing infrastructure. Cloud computing generally refers to the provision of scalable computing resources as a service over a network. More formally, cloud computing may be defined as a computing capability that provides an abstraction between the computing resource and its underlying technical architecture (e.g., servers, storage, networks), enabling convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort or service provider interaction. Thus, cloud computing allows a user to access virtual computing resources (e.g., storage, data, applications, and even complete virtualized computing systems) in “the cloud,” without regard for the underlying physical systems (or locations of those systems) used to provide the computing resources.
Typically, cloud computing resources are provided to a user on a pay-per-use basis, where users are charged only for the computing resources actually used (e.g., an amount of storage space consumed by a user or a number of virtualized systems instantiated by the user). A user can access any of the resources that reside in the cloud at any time, and from anywhere across the Internet. In context of the present disclosure, a user may access applications (e.g., a retail store app for a mobile computing device) or related data (e.g., compiled shopping data) available in the cloud. Doing so allows a user to access this information from any computing system attached to a network connected to the cloud (e.g., the Internet).
While the foregoing is directed to embodiments of the present disclosure, other and further embodiments of the disclosure may be devised without departing from the basic scope thereof, and the scope thereof is determined by the claims that follow.