Embodiments of the present invention provide an intuitive platform for non-contact instructions regarding resource allocation. In this way, a user may utilize one or more specific gestures which are captured, analyzed, and responded to by the system of the invention in order to initiate or complete one or more resource activities such as a resource transfer, resource transfer split, or other resource action.
1. A system for identifying and authorizing non-contact instructions, the system comprising:
a memory device; and
a processing device operatively coupled to the memory device, wherein the processing device is configured to execute computer-readable program code to:
continuously monitor a data capture sensor via a user device;
analyze the data capture sensor data using convolutional neural network modeling in order to identify a specific gesture;
in response to identification of the specific gesture, generate a prompt for display on the user device indicating an option for a split resource activity;
receive an input response via the user device indicating that the user wishes to complete the split resource activity;
map the specific gesture to the split resource activity by initiating a transfer of resources from a resource account of the user to a resource account of one of the user's contacts; and
store user gesture input data in a user configuration associated with a resource account number and the user's identity.
7. A computer program product for identifying and authorizing non-contact instructions, the computer program product comprising at least one non-transitory computer readable medium comprising computer readable instructions, the instructions comprising instructions for:
continuously monitoring a data capture sensor via a user device;
analyzing the data capture sensor data using convolutional neural network modeling in order to identify a specific gesture;
in response to identification of the specific gesture, generating a prompt for display on the user device indicating an option for a split resource activity;
receiving an input response via the user device indicating that the user wishes to complete the split resource activity;
mapping the specific gesture to the split resource activity by initiating a transfer of resources from a resource account of the user to a resource account of one of the user's contacts; and
storing user gesture input data in a user configuration associated with a resource account number and the user's identity.
13. A computer implemented method for identifying and authorizing non-contact instructions, said computer implemented method comprising:
providing a computing system comprising a computer processing device and a non-transitory computer readable medium, where the computer readable medium comprises configured computer program instruction code, such that when said instruction code is operated by said computer processing device, said computer processing device performs the following operations:
continuously monitor a data capture sensor via a user device;
analyze the data capture sensor data using convolutional neural network modeling in order to identify a specific gesture;
in response to identification of the specific gesture, generate a prompt for display on the user device indicating an option for a split resource activity;
receive an input response via the user device indicating that the user wishes to complete the split resource activity;
map the specific gesture to the split resource activity by initiating a transfer of resources from a resource account of the user to a resource account of one of the user's contacts; and
store user gesture input data in a user configuration associated with a resource account number and the user's identity.
2. The system of claim 1, wherein the processing device is further configured to execute computer-readable program code to:
continuously monitor resource account activity of a resource account of the user;
identify a first merchant with which the user has transacted multiple times;
calculate an average resource value for transactions with the first merchant from the resource account of the user;
identify a later resource activity from the resource account of the user to the first merchant that is above a pre-set threshold, wherein the pre-set threshold is a percentage above the average resource value; and
generate a push notification to the user device.
3. The system of
4. The system of
5. The system of
6. The system of
8. The computer program product of claim 7, wherein the instructions further comprise instructions for:
continuously monitoring resource account activity of a resource account of the user;
identifying a first merchant with which the user has transacted multiple times;
calculating an average resource value for transactions with the first merchant from the resource account of the user;
identifying a later resource activity from the resource account of the user to the first merchant that is above a pre-set threshold, wherein the pre-set threshold is a percentage above the average resource value; and
generating a push notification to the user device.
9. The computer program product of
10. The computer program product of
11. The computer program product of claim 8, wherein the instructions further comprise instructions for:
monitoring a geolocation of one or more user devices, cross-referencing the user devices with the user's contacts, and identifying one or more specific contacts in proximity to the user at a time coinciding with the later resource activity.
12. The computer program product of
14. The computer implemented method of claim 13, wherein said computer processing device further performs the following operations:
continuously monitor resource account activity of a resource account of the user;
identify a first merchant with which the user has transacted multiple times;
calculate an average resource value for transactions with the first merchant from the resource account of the user;
identify a later resource activity from the resource account of the user to the first merchant that is above a pre-set threshold, wherein the pre-set threshold is a percentage above the average resource value; and
generate a push notification to the user device.
15. The computer implemented method of
16. The computer implemented method of
17. The computer implemented method of
The present invention generally relates to the field of intuitive solutions for performing actions related to network and resource access.
In conventional systems, a user may utilize a mobile application or web portal to conduct various activities related to one or more resource accounts. With the advent of improved device sensor technology and intelligent neural network analysis, there is an opportunity to provide a more seamless user experience that incorporates a non-contact gesturing approach to initiate or complete certain resource activities.
The following presents a simplified summary of one or more embodiments of the invention in order to provide a basic understanding of such embodiments. This summary is not an extensive overview of all contemplated embodiments, and is intended to neither identify key or critical elements of all embodiments, nor delineate the scope of any or all embodiments. Its sole purpose is to present some concepts of one or more embodiments in a simplified form as a prelude to the more detailed description that is presented later.
Embodiments of the present invention comprise systems, methods, and computer program products that address the above and/or other needs by providing a system to remove the need for a physical interaction between the user and a user device such that the user can gesture at a certain distance from the user device in order to initiate a specific form of resource action. The present invention utilizes various intelligent systems for image, infrared, and video data analysis in order to provide a touchless experience when interacting with a user device to conduct various activities related to user resource accounts. Embodiments of the present invention may comprise a system that intelligently recognizes and responds to user-specific hand gestures captured by one or more device sensors, such as image sensors, infrared sensors, proximity sensors, or the like. Using the present invention, the user may move their hands or fingers in a particular manner to instruct the system to complete a resource activity. For instance, in some embodiments, a user may wish to “split” a resource transfer between multiple parties, and may gesture in a particular manner to instruct the system that the resource transfer should be split. In this way, the user no longer needs to navigate multiple menu systems, buttons, applications, contact lists, or the like, in the course of initiating a split resource transfer. Instead, the user may simply gesture in a scissor-like motion, and the system may intelligently recognize the user's intention of splitting a resource transfer between multiple parties.
For illustrative purposes, example system environments will be summarized. Generally, the invention may comprise the steps of: continuously monitoring a data capture sensor via a user device; analyzing the data capture sensor data using convolutional neural network modeling in order to identify a specific gesture; in response to identification of the specific gesture, generating a prompt for display on the user device indicating an option for a split resource activity; receiving an input response via the user device indicating that the user wishes to complete the split resource activity; and mapping the specific gesture to the split resource activity by initiating a transfer of resources from a resource account of the user to a resource account of one of the user's contacts.
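The sequence of steps just summarized can be sketched as a minimal event loop. All function and parameter names below are hypothetical placeholders for device- and entity-specific implementations (the sensor feed, the trained CNN, the display prompt, and the transfer service), not part of any named library:

```python
# Minimal sketch of the gesture monitoring loop described above. The
# callbacks are hypothetical stand-ins: `classify` wraps CNN inference,
# `prompt_user` displays the split-activity prompt, and `initiate_split`
# performs the resource transfer.

def run_gesture_loop(frames, classify, prompt_user, initiate_split):
    """For each captured sensor frame, run CNN classification; on a
    recognized 'split' gesture, prompt the user and, if confirmed,
    initiate the split resource activity. Returns the number of split
    activities initiated."""
    splits = 0
    for frame in frames:                  # continuous sensor monitoring
        gesture = classify(frame)         # CNN inference (placeholder)
        if gesture == "split":            # specific gesture identified
            if prompt_user():             # display prompt, await response
                initiate_split()          # map gesture to resource action
                splits += 1
    return splits
```

In a deployed system the frame source would be an open-ended sensor stream rather than a finite list; the finite list here simply keeps the sketch testable.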
In some embodiments, the invention is further configured to continuously monitor resource account activity of a resource account of the user; identify a first merchant with which the user has transacted multiple times; calculate an average resource value for transactions with the first merchant from the resource account of the user; identify a later resource activity from the resource account of the user to the first merchant that is above a pre-set threshold, wherein the pre-set threshold is a percentage above the average resource value; and generate a push notification to the user device.
In some embodiments, the push notification to the user device further comprises a recommendation or option to initiate a split resource activity.
In some embodiments, the split resource activity comprises a request for payment of resources from one or more of the user's contacts.
In some embodiments, the invention is further configured to monitor a geolocation of one or more user devices, cross-reference the user devices with the user's contacts, and identify one or more specific contacts in proximity to the user at a time coinciding with the later resource activity.
In some embodiments, the one or more data capture sensors further comprise one or more proximity sensors and one or more camera sensors.
In some embodiments, the invention is further configured to store user gesture input data in a user configuration associated with a resource account number and the user's identity.
The features, functions, and advantages that have been discussed may be achieved independently in various embodiments of the present invention or may be combined with yet other embodiments, further details of which can be seen with reference to the following description and drawings.
Having thus described embodiments of the invention in general terms, reference will now be made to the accompanying drawings, wherein:
Embodiments of the present invention will now be described more fully hereinafter with reference to the accompanying drawings, in which some, but not all, embodiments of the invention are shown. Indeed, the invention may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will satisfy applicable legal requirements. Like numbers refer to elements throughout. Where possible, any terms expressed in the singular form herein are meant to also include the plural form and vice versa, unless explicitly stated otherwise. Also, as used herein, the term “a” and/or “an” shall mean “one or more,” even though the phrase “one or more” is also used herein.
In some embodiments, an “entity” or “enterprise” as used herein may be any institution employing information technology resources and particularly technology infrastructure configured for large scale processing of electronic files, electronic technology event data and records, and performing/processing associated technology activities. In some instances, the entity's technology systems comprise multiple technology applications across multiple distributed technology platforms for large scale processing of technology activity files and electronic records. As such, the entity may be any institution, group, association, financial institution, establishment, company, union, authority or the like, employing information technology resources.
As described herein, a “user” is an individual associated with an entity. In some embodiments, a “user” may be an employee (e.g., an associate, a project manager, an IT specialist, a manager, an administrator, an internal operations analyst, or the like) of the entity or enterprises affiliated with the entity, capable of operating the systems described herein. In some embodiments, a “user” may be any individual, entity or system who has a relationship with the entity, such as a customer. In other embodiments, a user may be a system performing one or more tasks described herein.
In the instances where the entity is a financial institution, a user may be an individual or entity with one or more relationships, affiliations, or accounts with the entity (for example, a financial institution). In some embodiments, the user may be an entity or financial institution employee (e.g., an underwriter, a project manager, an IT specialist, a manager, an administrator, an internal operations analyst, bank teller or the like) capable of operating the system described herein. In some embodiments, a user may be any individual or entity who has a relationship with a customer of the entity or financial institution. For purposes of this invention, the terms “user” and “customer” may be used interchangeably. A “technology resource” or “account” may be the relationship that the user has with the entity. Examples of technology resources include a deposit account, such as a transactional account (e.g. a banking account), a savings account, an investment account, a money market account, a time deposit, a demand deposit, a pre-paid account, a credit account, or the like. The technology resource is typically associated with and/or maintained by an entity.
It is understood that “user devices,” such as user device(s) 140, may represent various forms of electronic devices, including user input devices such as personal digital assistants, cellular telephones, smartphones, laptops, desktops, webcams, microphones, scanners, printers, projectors, speakers, CD/DVD-drives, and/or the like, merchant input devices such as point-of-sale (POS) devices, electronic payment kiosks, and/or the like, resource dispensing devices (e.g., automated teller machine (ATM)), and/or edge devices such as routers, routing switches, integrated access devices (IAD), and/or the like. In the case of user device 140 representing an ATM, the device may contain specialized equipment such as video sensors, camera sensors, proximity sensors, infrared sensors, or the like, which enable the user 102 to enter information in a touchless manner.
As used herein, a “user interface” or “UI” may be an interface for user-machine interaction. In some embodiments, such as in the case of a user interaction with an ATM, the user interface may allow for interaction without physical touch of the user device, such as gesturing. In some embodiments the user interface comprises a graphical user interface. Typically, a graphical user interface (GUI) is a type of interface that allows users to interact with electronic devices through graphical icons and visual indicators such as secondary notation, as opposed to using only text via the command line. That said, the graphical user interfaces are typically configured for audio, visual and/or textual communication. In some embodiments, the graphical user interface may include both graphical elements and text elements. The graphical user interface is configured to be presented on one or more display devices associated with user devices, entity systems, processing systems and the like. In some embodiments the user interface comprises one or more of an adaptive user interface, a graphical user interface, a kinetic user interface, a tangible user interface, and/or the like, in part or in its entirety. In some embodiments, the GUI may respond intelligently to user gestures via the user device's ability to record or recognize user movements near the device, over a keypad, over a virtually rendered keypad on the GUI, or the like.
The network 101 may be a system specific distributive network receiving and distributing specific network feeds and identifying specific network associated triggers. The network 101 may also be a global area network (GAN), such as the Internet, a wide area network (WAN), a local area network (LAN), or any other type of network or combination of networks. The network 101 may provide for wireline, wireless, or a combination wireline and wireless communication between devices on the network 101.
In some embodiments, the user 102 may be one or more individuals or entities that may provide images for analysis, recognition, and extraction; query the non-contact instructions system 108 for identified attributes; set parameters and metrics for data analysis; and/or receive/utilize centralized database information created and disseminated by the non-contact instructions system 108. As such, in some embodiments, the user 102 may be associated with the entity and/or a financial institution. In other embodiments, the user 102 may be associated with another system or entity, such as third party system 105, which may be granted access to the non-contact instructions system 108 or entity system 106 in some embodiments.
The user device 104 comprises computer-readable instructions 110 and data storage 118 stored in the memory device 116, which in one embodiment includes the computer-readable instructions 110 of a user application 122. In some embodiments, the non-contact instructions system 108 and/or the entity system 106 are configured to cause the processing device 114 to execute the computer readable instructions 110, thereby causing the user device 104 to perform one or more functions described herein, for example, via the user application 122 and the associated user interface.
As further illustrated in
The processing device 148 is operatively coupled to the communication device 146 and the memory device 150. The processing device 148 uses the communication device 146 to communicate with the network 101 and other devices on the network 101, such as, but not limited to the entity system 106, the third party system 105, and the user system 104. As such, the communication device 146 generally comprises a modem, server, or other device for communicating with other devices on the network 101.
As further illustrated in
As such, the processing device 148 is configured to perform some or all of the data processing and event capture, transformation and analysis steps described throughout this disclosure, for example, by executing the computer readable instructions 154. In this regard, the processing device 148 may perform one or more steps singularly and/or transmit control instructions to the CNN model 156, entity system 106, user device 104, and third party system 105 and/or other systems and applications, to perform one or more steps described throughout this disclosure. Although various data processing steps may be described as being performed by the CNN model 156 and/or its components/applications and the like in some instances herein, it is understood that the processing device 148 is configured to establish operative communication channels with and/or between these modules and applications, and transmit control instructions to them, via the established channels, to cause these modules and applications to perform these steps.
Embodiments of the non-contact instructions system 108 may include multiple systems, servers, computers or the like maintained by one or many entities.
In one embodiment of the non-contact instructions system 108, the memory device 150 stores, but is not limited to, the CNN model 156. In one embodiment of the invention, the CNN model 156 may be associated with computer-executable program code that instructs the processing device 148 to operate the communication device 146 to perform certain communication functions involving the third party system 105, the user device 104 and/or the entity system 106, as described herein. In one embodiment, the computer-executable program code of an application associated with the CNN model 156 may also instruct the processing device 148 to perform certain logic, data processing, and data storing functions of the application.
The processing device 148 is configured to use the communication device 146 to receive data, such as images, or metadata associated with images, transmit and/or cause display of extracted data and the like. In the embodiment illustrated in
As illustrated in
As further illustrated in
It is understood that the servers, systems, and devices described herein illustrate one embodiment of the invention. It is further understood that one or more of the servers, systems, and devices can be combined in other embodiments and still function in the same or similar way as the embodiments described herein.
As shown in blocks 208 and 210, the system may then either validate user input, or reject user input, respectively, based on the response of the user to the generated gesture prompt. In the event that the user input is rejected, as indicated in block 210, the system may attempt to rectify a possible system error by repeating the gestured entry process in conjunction with a simultaneous recalibration event. The system may generate an additional display prompt for the user gesture, and reset the gesture collection process, as indicated in block 214. Next, as indicated in block 216, the system may repeat data collection via the one or more device sensors of user device 104. As the user completes the gesturing motion, the system may attempt to apply a slightly altered or different CNN model 156 algorithm in order to discern the user gesture input. If the user gesture input is validated, the system may proceed to validate the user input, as shown in block 218. Given that the user gesture input may differ slightly for each user, the system may store unique gesture pattern data for the user in a user configuration as a part of data repository 160.
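The reject-and-recalibrate flow of blocks 210 through 218 can be expressed as a simple retry loop. The model list and capture callback below are hypothetical stand-ins for the slightly altered CNN model 156 variants and the device sensor capture, and the attempt limit is an illustrative assumption:

```python
# Sketch of the retry-with-recalibration flow: on rejection, the system
# re-prompts the user, repeats data collection, and applies a slightly
# altered model variant. `capture` and the entries of `models` are
# hypothetical callables, not part of any named library.

def validate_gesture(capture, models, max_attempts=3):
    """Attempt to recognize a user gesture; each failed attempt repeats
    the capture with the next model variant. Returns the recognized
    gesture label, or None if all attempts are rejected."""
    for attempt in range(max_attempts):
        model = models[attempt % len(models)]  # altered CNN variant
        frame = capture()                      # repeat data collection
        gesture = model(frame)                 # attempt recognition
        if gesture is not None:                # validate user input
            return gesture
    return None                                # reject after all attempts
```

On success, a real implementation would then store the user's unique gesture pattern data in the user configuration, as described above.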
In some embodiments, when the system rejects user input as indicated in block 210, or if the user has provided a gesture in error, the system may allow the user to gesture in a specific manner in order to restart the gesture input process. For instance, when generating the display of user prompt for user gesture input, the system may also generate text or animations on the display to indicate that waving the user's hand over the screen of the user device 104 will cause the gesture input process to reset and start from the beginning of the data capture process. For instance, if the system detects that the user waves their hand past a data sensor of the user device, the system may generate a display to indicate that the previous gesture input data has been cleared. In this way, the user may conveniently start the process of gesturing over again without the need to navigate a menu system, repeatedly press a backspace button, “clear” button, or the like. Given that users may not be familiar with the user gesture input process initially, or that the system may not have calibrated for the specific user yet, this allows the user an intuitive way to start the process again if user gesture entry is not accurate on the first attempt.
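The wave-to-reset behavior can be illustrated with a minimal buffer sketch, where the event labels are hypothetical names for already-classified sensor readings:

```python
# Sketch of the gesture-input reset: a detected "wave" event clears any
# previously buffered gesture input data, so the user can restart entry
# without navigating menus or pressing a clear button. Event labels are
# hypothetical placeholders.

def process_sensor_events(events):
    """Accumulate gesture input events, clearing the buffer whenever a
    'wave' event is detected. Returns the surviving buffered events."""
    buffer = []
    for event in events:
        if event == "wave":    # hand waved past the data sensor
            buffer.clear()     # discard previous gesture input data
        else:
            buffer.append(event)
    return buffer
```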
In other embodiments, the user may receive a split payment, or split resource activity, request in the form of a push notification generated by the system, as indicated in block 306. For instance, even at times when the user is not interacting with the entity application 144, the system may be running processes in the background in order to monitor situations where a split resource activity may be required. For instance, if the user makes a purchase using a resource instrument connected to their resource account managed by the entity, the system may recognize that the transaction has occurred, and may pre-emptively push a notification to the user in order to prompt the user to initiate a split resource activity request. For instance, the system may continuously monitor the user's resource account history in order to identify a transaction, resource activity, or the like, that has just occurred. In some embodiments, the resource activity may be a transaction at a restaurant, sporting event, or the like, in which the user spends a certain resource amount. If the resource amount exceeds a certain category-based threshold, or exceeds an average amount spent at a given merchant or location, the system may generate a prediction that the resource activity may have involved more than one party. For instance, if the user's resource activity history on their resource account shows a transaction history for $10, $12, and $8 at a specific coffee shop, the system may calculate a running average of these transaction values over time and store it in the user configuration associated with the user. In this way, the system may determine that the user typically spends a certain amount at a given coffee shop location; in this instance, an average of about $10 at the particular coffee shop.
If a later resource activity is recognized as exceeding the average value by a given percentage amount, for example, in some embodiments, greater than 40% more than the average amount, or the like, then the system may determine that the user has paid for a friend, colleague, family member, or the like. In this way the system may deduce that the user may want to initiate a split resource activity in order to request a certain amount from another user in the form of a peer-to-peer (P2P) payment.
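Using the coffee-shop figures above, the threshold check reduces to a few lines of arithmetic. The function name and the 40% default are illustrative assumptions matching the example, not fixed values from the disclosure:

```python
# Sketch of the threshold test described above: a new transaction
# triggers a split suggestion when it exceeds the running average of
# prior transactions at the same merchant by more than a set percentage.

def exceeds_split_threshold(history, amount, pct=0.40):
    """Return True when `amount` exceeds the average of `history` by
    more than `pct` (e.g. 40% above the average)."""
    average = sum(history) / len(history)      # running merchant average
    return amount > average * (1 + pct)        # pre-set threshold check

# Coffee-shop history from the example: average = $10, threshold = $14.
history = [10, 12, 8]
exceeds_split_threshold(history, 25)   # $25 > $14 -> True
exceeds_split_threshold(history, 12)   # $12 <= $14 -> False
```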
In some embodiments, the system may simultaneously monitor geolocation data via the user device 104. In some embodiments, one of the user's contacts may also maintain a resource account with the entity, and the system may monitor the user's location, and the location of the user's contact. For instance, the user configuration for the user and the user's contact may include the users' phone numbers, device identification numbers, IP addresses, or the like. The system may perform a cross comparison of the user's contacts with other known users in the area that maintain an account with the entity. In this way, the system may recognize, in conjunction with the fact that the user may have paid for a family member, friend, or colleague, that the user is also in close proximity to someone from their contacts. In this way, the system may not only deduce that the user may like to split the resource activity, but may also deduce a probable contact that the user would like to direct the request to.
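A minimal sketch of the proximity cross-comparison follows, assuming each contact's device reports a latitude/longitude pair at the time of the resource activity. The haversine distance and the radius value are illustrative choices, not taken from the disclosure:

```python
import math

def contacts_in_proximity(user_location, contact_locations, radius_km=0.1):
    """Cross-reference contacts' device geolocations against the user's
    location and return the names of contacts within `radius_km`,
    approximating the proximity deduction described above. Locations are
    (latitude, longitude) pairs in degrees."""
    def haversine_km(a, b):
        # Great-circle distance between two lat/lon points, in km.
        lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
        h = (math.sin((lat2 - lat1) / 2) ** 2
             + math.cos(lat1) * math.cos(lat2)
             * math.sin((lon2 - lon1) / 2) ** 2)
        return 2 * 6371 * math.asin(math.sqrt(h))

    return [name for name, loc in contact_locations.items()
            if haversine_km(user_location, loc) <= radius_km]
```

A contact returned by this check could then be offered as the probable recipient of the split resource activity request.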
As shown in blocks 308 and 310, following the user initiated interaction with the entity application 144, or the determination that the user may be interested in a split resource activity request and generating the push notification, the system may generate a display for a finger gesture to initiate a split resource action. For instance, the system may display, via the entity application 144, on the user device 104, a graphic or text indicating that the user may gesture in a “scissor” motion, wherein they extend and move their middle and index finger together and apart, within view of the user device 104 camera sensor, in order to indicate that they would like to initiate a split resource action. Following the user's gesture, the system may generate a communicable link with a backend resource application or payment rail, and initiate the split resource activity on the user's behalf, as shown in block 312. The process then ends as shown in block 314, after the system has initiated the split resource activity.
As shown in block 406, the system may also receive data indicating potential user interest in a split resource activity. As discussed with regard to
As will be appreciated by one of ordinary skill in the art, the present invention may be embodied as an apparatus (including, for example, a system, a machine, a device, a computer program product, and/or the like), as a method (including, for example, a business process, a computer-implemented process, and/or the like), or as any combination of the foregoing. Accordingly, embodiments of the present invention may take the form of an entirely software embodiment (including firmware, resident software, micro-code, and the like), an entirely hardware embodiment, or an embodiment combining software and hardware aspects that may generally be referred to herein as a “system.” Furthermore, embodiments of the present invention may take the form of a computer program product that includes a computer-readable storage medium having computer-executable program code portions stored therein. As used herein, a processor may be “configured to” perform a certain function in a variety of ways, including, for example, by having one or more special-purpose circuits perform the functions by executing one or more computer-executable program code portions embodied in a computer-readable medium, and/or having one or more application-specific circuits perform the function.
It will be understood that any suitable computer-readable medium may be utilized. The computer-readable medium may include, but is not limited to, a non-transitory computer-readable medium, such as a tangible electronic, magnetic, optical, infrared, electromagnetic, and/or semiconductor system, apparatus, and/or device. For example, in some embodiments, the non-transitory computer-readable medium includes a tangible medium such as a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a compact disc read-only memory (CD-ROM), and/or some other tangible optical and/or magnetic storage device. In other embodiments of the present invention, however, the computer-readable medium may be transitory, such as a propagation signal including computer-executable program code portions embodied therein.
It will also be understood that one or more computer-executable program code portions for carrying out the specialized operations of the present invention may be written in object-oriented, scripted, and/or unscripted programming languages, such as, for example, Java, Perl, Smalltalk, C++, SAS, SQL, Python, Objective C, and/or the like. In some embodiments, the one or more computer-executable program code portions for carrying out operations of embodiments of the present invention are written in conventional procedural programming languages, such as the “C” programming languages and/or similar programming languages. The computer program code may alternatively or additionally be written in one or more multi-paradigm programming languages, such as, for example, F#.
It will further be understood that some embodiments of the present invention are described herein with reference to flowchart illustrations and/or block diagrams of systems, methods, and/or computer program products. It will be understood that each block included in the flowchart illustrations and/or block diagrams, and combinations of blocks included in the flowchart illustrations and/or block diagrams, may be implemented by one or more computer-executable program code portions.
It will also be understood that the one or more computer-executable program code portions may be stored in a transitory or non-transitory computer-readable medium (e.g., a memory, and the like) that can direct a computer and/or other programmable data processing apparatus to function in a particular manner, such that the computer-executable program code portions stored in the computer-readable medium produce an article of manufacture, including instruction mechanisms which implement the steps and/or functions specified in the flowchart(s) and/or block diagram block(s).
The one or more computer-executable program code portions may also be loaded onto a computer and/or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer and/or other programmable apparatus. In some embodiments, this produces a computer-implemented process such that the one or more computer-executable program code portions which execute on the computer and/or other programmable apparatus provide operational steps to implement the steps specified in the flowchart(s) and/or the functions specified in the block diagram block(s). Alternatively, computer-implemented steps may be combined with operator and/or human-implemented steps in order to carry out an embodiment of the present invention.
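By way of non-limiting illustration only, the computer-implemented process described above (capturing sensor data, identifying a specific gesture, prompting the user, and mapping the confirmed gesture to a resource activity, as recited in the claims) might be sketched as follows in Python. All identifiers here are hypothetical and do not appear in the claimed embodiments; in particular, the trivial label lookup stands in for the convolutional neural network model.

```python
# Hypothetical sketch of the claimed flow: a captured frame is classified as
# a gesture, the gesture is mapped to a resource activity, and the activity
# proceeds only if the user confirms the displayed prompt. Every name here
# (classify_gesture, GESTURE_ACTIONS, handle_frame) is illustrative.

GESTURE_ACTIONS = {
    "two_finger_split": "split_resource_activity",
    "swipe_right": "resource_transfer",
}

def classify_gesture(frame):
    # Stand-in for the convolutional-neural-network model recited in the
    # claims; here it simply reads a pre-attached label for illustration.
    return frame.get("label")

def handle_frame(frame, confirm):
    """Return the mapped activity for a recognized, user-confirmed gesture."""
    gesture = classify_gesture(frame)
    activity = GESTURE_ACTIONS.get(gesture)
    if activity is None:
        return None          # no recognized gesture in this frame
    if not confirm(activity):
        return None          # user declined the prompt on the device
    return activity

# A frame labeled with the split gesture, with the displayed prompt accepted:
result = handle_frame({"label": "two_finger_split"}, confirm=lambda a: True)
```

In an actual embodiment, the confirmation callback would correspond to the prompt displayed on the user device, and a confirmed activity would initiate the transfer of resources between resource accounts.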
While certain exemplary embodiments have been described and shown in the accompanying drawings, it is to be understood that such embodiments are merely illustrative of, and not restrictive on, the broad invention, and that this invention not be limited to the specific constructions and arrangements shown and described, since various other changes, combinations, omissions, modifications and substitutions, in addition to those set forth in the above paragraphs, are possible. Those skilled in the art will appreciate that various adaptations and modifications of the just described embodiments can be configured without departing from the scope and spirit of the invention. Therefore, it is to be understood that, within the scope of the appended claims, the invention may be practiced other than as specifically described herein.
Inventors: Chauhan, Sandeep Kumar; Sharma, Yash; Veeramreddy, Sudarshan; Yadav, Durgesh Singh; Rao, Ravikiran Subramanya; Matury, Suman; Lal, Geetika; Gaddam, Ramarao; Gajula, Anil; Magham, Koteswara Rao Venkata; Miryala, Santosh Kumar
Assignee: Bank of America Corporation (assignments of assignors' interest executed Jun. 2 through Jul. 15, 2022, recorded at Reel/Frame 060981/0536; assignment on the face of the patent dated Jul. 27, 2022).