A method translates gestures made by one avatar to a second avatar in a virtual world by receiving an input from a first user representing an input gesture to be made by the first avatar to the second avatar. The input gesture is translated to generate at least one translated gesture, which may be output for display as being made by the first avatar to the second avatar.
1. A method for translating gestures in a virtual world, comprising:
receiving an input gesture to be made by a first avatar directed to a second avatar in the virtual world;
displaying the input gesture on a first display;
translating, using a processor, the input gesture based at least in part on one or more environmental, cultural or social factors to generate at least one translated gesture for display; and
displaying the at least one translated gesture on a second display separate from the first display, wherein the at least one translated gesture is different than the input gesture.
21. An apparatus for translating gestures in a virtual world, comprising:
a processor capable of:
receiving an input gesture to be made by a first avatar directed to a second avatar in the virtual world;
displaying the input gesture on a first display;
translating the input gesture based at least in part on one or more environmental, cultural or social factors to generate at least one translated gesture for display; and
displaying the at least one translated gesture on a second display separate from the first display, wherein the at least one translated gesture is different than the input gesture.
11. A method for translating gestures in a virtual world, comprising:
receiving an input from a first user representing an input gesture to be made by a first avatar to a second avatar in the virtual world;
translating, using a processor, the input gesture input by the first user, based at least in part on one or more environmental, cultural or social factors to generate at least one translated gesture;
outputting on a first display a depiction of the gesture input by the first user as being made by the first avatar to the second avatar; and
outputting on a second display the translated gesture as being made by the first avatar to the second avatar, wherein the first display is separate from the second display.
16. A computer program product for translating gestures in a virtual world, comprising:
a computer readable medium having computer readable program code embodied therein, the computer readable medium comprising:
computer readable program code configured to receive an input gesture to be made by a first avatar directed to a second avatar in the virtual world;
computer readable program code configured to display the input gesture on a first display;
computer readable program code configured to translate the input gesture based at least in part on one or more environmental, cultural or social factors to generate at least one translated gesture for display; and
computer readable program code configured to display the at least one translated gesture on a second display separate from the first display, wherein the at least one translated gesture is different than the input gesture.
The invention relates to simulations, such as virtual world simulations of the real world, and more particularly to a system and method for translating gestures transmitted between users to conform to cultural or other selected characteristics associated with the users receiving the gestures in a virtual world.
Computer based simulations are becoming more ubiquitous. Simulations may be used for training purposes, for entertainment or for other purposes. Computer simulations such as Second Life™ or similar simulations present a virtual world which allows users or players to be represented by characters known as avatars. Second Life is a trademark of Linden Research, Inc. in the United States, other countries or both. Second Life is an Internet-based virtual world launched in 2003 by Linden Research, Inc. A downloadable client program called the Second Life Viewer enables users, called “Residents”, to interact with others in the virtual world through motional avatars. The virtual world basically simulates the real world or environment. The users or residents, via their avatars, can explore the virtual world, meet other users or residents, socialize, participate in individual and group activities, and create and trade items (virtual property) and services with one another.
Virtual worlds are filled with users from many different geographic locations, different cultures, and different ethnic groups. Further, virtual worlds provide many different environments in which users interact. This diversity raises communication issues. A gesture presented one way in one culture may not be understood or, worse, could be offensive to a user of another culture. Further, courtesy can typically be shown by altering the gesture to fit the culture of the user with whom one is interacting. Additionally, gestures may differ across environments. For example, in a business environment a “hello” coupled with a handshake may be an appropriate gesture, whereas in a more casual environment a “hello” coupled with a simple nod will suffice.
In accordance with an aspect of the invention, a method for translating gestures in a virtual world may comprise receiving an input from a first user representing an input gesture to be made by a first avatar to a second avatar in the virtual world. The method may also include translating the input gesture input by the first user to generate at least one translated gesture for display.
In accordance with another aspect of the invention, a method for translating gestures in a virtual world may comprise receiving an input from a first user representing an input gesture to be made by a first avatar to a second avatar. The method may further comprise translating the input gesture to generate at least one translated gesture based on a set of translation gestures defined for use when communicating with the second avatar.
In accordance with another aspect of the invention, a method for translating gestures in a virtual world may comprise receiving an input from a first user representing an input gesture to be made by a first avatar to a second avatar in the virtual world. The method may further comprise translating the input gesture input by the first user to generate at least one translated gesture. The method may yet additionally comprise outputting for display a depiction of the gesture input by the first user as being made by the first avatar to the second avatar and outputting for display the translated gesture as being made by the first avatar to the second avatar.
In accordance with another aspect of the invention, a computer program product for translating gestures in a virtual world comprises a computer readable medium having computer readable program code embodied therein. The computer readable medium may comprise computer readable program code configured to receive an input gesture to be made by a first avatar directed to a second avatar in the virtual world and computer readable program code configured to translate the input gesture to generate at least one translated gesture for display.
Other aspects and features of the invention, as defined solely by the claims, will become apparent to those ordinarily skilled in the art upon review of the following non-limiting detailed description of the invention in conjunction with the accompanying figures.
The following detailed description of embodiments refers to the accompanying drawings, which illustrate specific embodiments of the invention. Other embodiments having different structures and operations do not depart from the scope of the invention.
As will be appreciated by one of skill in the art, the invention may be embodied as a method, system, or computer program product. Accordingly, the invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module” or “system.” Furthermore, the invention may take the form of a computer program product on a computer-usable storage medium having computer-usable program code embodied in the medium.
Any suitable computer usable or computer readable medium may be utilized. The computer-usable or computer-readable medium may be, for example but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection having one or more wires, a tangible medium such as a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), or other tangible optical or magnetic storage device; or transmission media such as those supporting the Internet or an intranet. Note that the computer-usable or computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via, for instance, optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner, if necessary, and then stored in a computer memory. In the context of this document, a computer-usable or computer-readable medium may be any medium that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. The computer-usable medium may include a propagated data signal with the computer-usable program code embodied therewith, either in baseband or as part of a carrier wave. The computer usable program code may be transmitted using any appropriate medium, including but not limited to the Internet, wireline, optical fiber cable, radio frequency (RF) or other means.
Computer program code for carrying out operations of the invention may be written in an object oriented programming language such as Java, Smalltalk, C++ or the like. However, the computer program code for carrying out operations of the invention may also be written in conventional procedural programming languages, such as the “C” programming language or similar programming languages, or in functional programming languages, such as Haskell, Standard Meta Language (SML) or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
Aspects of the invention are described below with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
Aspects of the invention provide systems, methods, and computer program products for translating gestures sent between user avatars in a virtual world. Specifically, aspects of the invention may involve receipt of a gesture from a first avatar and, if needed, translation of the gesture into an appropriate gesture for communication to a second avatar. Depending on the embodiment, the gesture is translated to comport with cultural, environmental, or other settings, and may also be translated based on the identity of the avatar sending the gesture. Certain aspects of the invention translate gestures so as to more closely resemble the appropriate customs and standards of each avatar.
For example, if a first avatar is European and a second avatar is Japanese, a gesture from the European avatar representing a handshake could be translated to a bow for the second avatar. The gesture input by the first user (shaking hands) is output for display to depict the first and second avatars shaking hands, while the translated gesture is output for display to depict the first and second avatars bowing to each other in a culturally appropriate manner.
As another example, translations of a gesture may be tailored based on a given environment in which the gesture is sent. A different translated gesture could be used based on whether the environment is formal, such as the work place, or informal, such as a social event.
As another example, translation of a gesture may be dependent on the avatar sending the gesture. The translated gesture may be different if the avatar is a friend, as opposed to a mere acquaintance.
As illustrated, the virtual world system 18 is typically implemented as a server/processor 22 accessible via the network 20. This, however, could include peer-to-peer network configurations without a central host.
In one embodiment, the database 26 of the virtual world system 18 comprises data records 28 for each user and each avatar for each user, if the user has more than one avatar. While various pieces of information may be stored for each user and each user's avatar, for purposes of this invention, at least data are stored for each user regarding any cultural, ethnic, and/or social information associated with each avatar of the user that would dictate translation of gestures sent to the avatar.
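As a concrete illustration, such a per-avatar data record might be modeled as follows. This is a minimal sketch in Python; the field names, identifiers, and example values are assumptions made for illustration, not structures defined by the patent.

```python
from dataclasses import dataclass, field

@dataclass
class AvatarRecord:
    """Illustrative per-avatar record holding traits that drive gesture translation."""
    avatar_id: str
    user_id: str
    culture: str = "default"       # e.g. "japanese", "european" (hypothetical labels)
    setting: str = "informal"      # current social setting: "formal" or "informal"
    # Per-sender designations: sender avatar_id -> relationship ("friend", "acquaintance")
    designations: dict = field(default_factory=dict)

# A user may own several avatars, each with its own record keyed by avatar id.
records = {
    "av-2": AvatarRecord(avatar_id="av-2", user_id="user-14", culture="japanese",
                         designations={"av-7": "friend"}),
}
```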
The system 18 may further include a translation module 30 for translation of profiled gesture responses in a virtual world. In some embodiments, the system 18 further includes a proximity module 32 used to determine the proximity of avatars relative to each other in the virtual world.
Associated with the translation module 30 is a list of various gestures that may be transmitted by avatars in the virtual world. For each given gesture, there is stored one or more translation gestures that correspond to the given gesture but are based on particular environmental, cultural, ethnic, or other factors, or on factors associated with the avatar sending the gesture. The gestures and their associated translations may be implemented in many different ways.
In some embodiments, the gestures and associated translations are stored in the database 26 accessible by the translation module 30. In this embodiment, the gestures and translations may be stored in a simple file or may be stored in a relational database. In some embodiments, the gestures and translations are in the form of a rules engine embedded in the translation module 30.
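A minimal sketch of the simple-file or relational-table style of storage described above. The gesture and culture names here are invented examples, not terms taken from the patent.

```python
# Flat mapping, as might be stored in a simple file or a relational table.
# Key: (input_gesture, receiver_culture, setting) -> translated gesture.
TRANSLATIONS = {
    ("handshake", "japanese", "formal"): "bow",
    ("handshake", "japanese", "informal"): "nod",
    ("wave", "japanese", "formal"): "bow",
}

def lookup(gesture, culture, setting="formal"):
    """Return the stored translation, or pass the gesture through unchanged."""
    return TRANSLATIONS.get((gesture, culture, setting), gesture)
```

A rules-engine embodiment would replace the dictionary lookup with condition/action rules, but the pass-through default (display the original gesture when no rule matches) would likely be the same.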
In some embodiments, the gestures and associated translations are in a general database that is common to all avatars. In other embodiments, the gestures and translations may be stored for each user and each user avatar. In these embodiments, the gestures and translations may be customized for each avatar, such that a user may define how a gesture received by the avatar is to be translated prior to display to the user.
Regardless of where and how the gestures and associated translations are stored, it is understood that the translated gestures may be defined in any number of ways and with any desired level of granularity. For example, several different translations may be stored for a given gesture. There could be different translations for the same gesture based on a social setting, such that a received gesture from a third party avatar is translated for the user based on the social situation in which the gesture was made (such as formal, e.g., business/work, or informal). Where the translations are user defined, different translations may be designated for different third party avatars. For example, translation of a gesture from a mere acquaintance may be different than translation of the gesture received from a friend.
As an example, an avatar that is based on Japanese cultural norms may have associated with it more than one translation of a handshake gesture from a third party avatar. For non-designated third party avatars (e.g., those with which a more formal interaction is presumed), the translation may be a formal bow. However, for certain designated third party avatars, such as friends or acquaintances, the handshake gesture may be translated as a nod or not translated at all and displayed as a handshake.
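The handshake example above can be sketched as a small per-sender rule. The function name and the rule itself are hypothetical, written only to make the designated/non-designated distinction concrete.

```python
def translate_handshake(receiver_culture, sender_id, designated_friends):
    """Per-sender translation of a handshake, following the example above."""
    if receiver_culture != "japanese":
        return "handshake"            # no translation needed for this receiver
    if sender_id in designated_friends:
        return "nod"                  # designated friend or acquaintance
    return "bow"                      # formal interaction presumed otherwise
```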
In this embodiment, a first user 12 directs an associated first avatar to communicate a gesture to a second avatar associated with a second user 14. (The second avatar could instead be associated with the first user rather than another user.) The gesture is received by the server/processor 22 of the virtual world system 18. (See block 100). A gesture may include any visual, textual or oral communication or gesture in a virtual world that an avatar may perform or which may be performed for an avatar. Examples of a gesture may include, but are not limited to, extending a hand, bowing, saying “hello”, waving, winking, writing out “hello”, etc.
In this particular embodiment, the first user defines the avatar to which the gesture is directed, i.e., the second avatar. As such, the server/processor 22 also detects the indication that the gesture is directed to the second avatar. (See block 102). The user may direct the gesture in several different ways. For example, in some embodiments, the user may direct the gesture to a plurality of other avatars by designating the avatars in the command. In other embodiments, to be discussed later, the user may direct the gesture to all avatars within a certain proximity of the user's avatar by specifying a desired distance or multi-dimensional environment.
The server/processor 22 next accesses information associated with the second avatar defining any cultural, ethnic, or other social characteristics of the avatar. (See block 104). Further, the server/processor 22 provides the gesture and the information regarding the second avatar to the translation module 30. Using the list of various gestures and associated translations 34, the translation module 30 determines an appropriate translation gesture to be sent to the second avatar based on the gesture received from the first user and the information regarding the second avatar. (See block 106).
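The two steps above (blocks 104 and 106) can be sketched as a single lookup pipeline. This is an assumed, simplified rendering; the record layout and table keys are illustrative.

```python
def handle_gesture(gesture, receiver_id, records, translations):
    """Sketch of blocks 104-106: fetch receiver traits, then pick a translation."""
    info = records[receiver_id]                           # block 104: access traits
    key = (gesture, info["culture"], info["setting"])
    return translations.get(key, gesture)                 # block 106: translate, or pass through

# Hypothetical data standing in for database 26 and translation list 34.
records = {"av-2": {"culture": "japanese", "setting": "formal"}}
translations = {("handshake", "japanese", "formal"): "bow"}
```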
The server/processor 22 creates the proper GUIs and transmits them to the computers 21 associated with the first and second users 12 and 14, respectively, for display. (See block 108). The GUIs for the first and second users may be different. The GUI displayed for the first user may depict the gesture in untranslated form, while the GUI for the second user depicts the translated gesture.
For example, if the gesture input by the first user is an extended hand for a hand shake, the GUI displayed to the first user will show the first avatar with an extended hand. If the second avatar is characterized as of Japanese origin or following Japanese cultural norms, the translated gesture may be a bow. Therefore, the GUI illustrated to the second user will show the first avatar bowing.
As is understood, the second user may reply to the gesture. In this instance, the second user would command the second avatar to bow in response. This gesture would be transmitted to the first avatar and translated by the virtual world system 18 in a similar manner. As such, the GUI presented to the first user would depict the first and second avatars shaking hands, while the GUI presented to the second user would depict the first and second avatars bowing to each other.
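The two-sided display described above can be sketched as follows: each user's view renders the whole exchange in that user's own idiom. The view strings are purely illustrative placeholders for the rendered GUIs.

```python
def render_views(input_gesture, translated_gesture):
    """Build per-user view descriptions: sender sees the original, receiver the translation."""
    return {
        "first_user_view": f"both avatars {input_gesture}",
        "second_user_view": f"both avatars {translated_gesture}",
    }

views = render_views("shake hands", "bow")
```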
Alternatively, or in addition to sending the gesture to specific avatars, the first user may send the gesture so as to be received by all users within a given proximity, such as a wave or similar gesture.
For each second avatar, the server/processor 22 next accesses information associated with the second avatar defining any cultural, ethnic, or other social characteristics of the avatar. (See block 204). Further, the server/processor 22 provides the gesture and the information regarding the second avatar to the translation module 30. Using the list of various gestures and associated translations 34, the translation module 30 determines an appropriate translation gesture to be sent to the second avatar based on the gesture received from the first user and the information regarding the second avatar. (See block 206).
The server/processor 22 creates the proper GUIs and transmits them to the computers 21 associated with the users of the second avatars. The server/processor 22 also creates and transmits a GUI for display to the first user. (See block 208). The GUIs for each of the users may be different. The GUI displayed for the first user may depict the gesture in untranslated form, while each of the GUIs for the second users depicts the translated gesture in accordance with the translation made for each respective avatar.
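One way a proximity module such as module 32 might select the recipients of a broadcast gesture is a simple distance filter. The 2-D coordinates and Euclidean distance metric are assumptions; the patent does not specify how proximity is measured.

```python
import math

def avatars_in_range(sender_id, positions, radius):
    """Return ids of avatars within `radius` of the sender (assumed 2-D coordinates)."""
    sx, sy = positions[sender_id]
    return sorted(a for a, (x, y) in positions.items()
                  if a != sender_id and math.hypot(x - sx, y - sy) <= radius)

# Hypothetical positions in world coordinates.
positions = {"av-1": (0.0, 0.0), "av-2": (3.0, 4.0), "av-3": (30.0, 40.0)}
```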
As mentioned previously, more than one translation may be provided for a given input gesture. Different translations may be used based on the social setting, such as formal or informal, or on the particular avatar sending the gesture. For example, one might expect to receive a more informal gesture from a friend than from a business colleague. As such, in some embodiments, the invention may translate the gesture based on the environment in which the gesture was made, the particular avatar making the gesture, or both.
The server/processor 22 creates the proper GUIs and transmits them to the computers 21 associated with the users of the second avatars. The server/processor 22 also creates and transmits a GUI for display to the first user. (See block 310). The GUIs for each of the users may be different. The GUI displayed for the first user may depict the gesture in untranslated form, while each of the GUIs for the second users depicts the translated gesture in accordance with the translation made for each respective avatar.
A similar process would be used to tailor the translation based on the individual/avatar sending the gesture. In this embodiment, using the list of various gestures and associated translations 34, the translation module 30 determines an appropriate translation gesture to be sent to the second avatar based on the gesture received from the first user, the information regarding the second avatar, and the information regarding the first avatar that sent the gesture.
As mentioned previously, in some embodiments, the gestures and associated translations are in a general database that is common to all avatars. In other embodiments, the gestures and translations may be stored for each user and each user avatar. In these embodiments, the gestures and translations may be customized for each avatar, such that a user may define how a gesture received by the avatar is to be translated prior to display to the user. Where individualized translation gestures are used, the invention would access these translation files, as opposed to a general database of translations to determine what translated gesture to display.
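The resolution order described above (individualized translation files first, general database second) can be sketched as a two-level lookup. Table shapes and names are assumptions for illustration.

```python
def resolve_translation(gesture, receiver_id, per_avatar_tables, general_table):
    """Prefer the receiver's individualized table; fall back to the general database,
    then to the untranslated gesture."""
    custom = per_avatar_tables.get(receiver_id, {})
    return custom.get(gesture, general_table.get(gesture, gesture))

# Hypothetical tables: av-2 has a user-defined override of the general rule.
general = {"handshake": "bow"}
custom = {"av-2": {"handshake": "nod"}}
```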
Gesture translation could also be applied to the different gestures the first user transmits, without requiring the first user to make such differentiations manually. In some embodiments, the first user may have a customized set of translations that are used to translate command gestures sent by the first user to others.
The above embodiments discuss translation of gestures in general. A gesture is meant to be any visual, textual or oral communication or gesture in a virtual world that an avatar may perform or which may be performed for an avatar. Aspects of the invention may be used to translate not only physical gestures, but also speech or text into different languages. Specifically, by adding a language translation module, speech and/or text may be translated.
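A language translation module could plug into the same pass-through pattern used for gestures. The toy phrase table below stands in for a real translation service; the phrases and language codes are invented for illustration.

```python
# Toy phrase table standing in for a real language translation service.
PHRASES = {("hello", "ja"): "konnichiwa", ("hello", "fr"): "bonjour"}

def translate_text(text, target_lang):
    """Translate a phrase if known; otherwise return the text unchanged,
    mirroring the pass-through behavior used for untranslatable gestures."""
    return PHRASES.get((text.lower(), target_lang), text)
```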
In the above embodiments, the various modules are described as software that is implemented by the server/processor to perform the various functions described above. In this instance, the various modules comprise computer code instructions for performing the various translation and display steps discussed above. It is understood, however, that the various modules may also be self-contained systems with embedded logic, decision making, state-based operations and other functions that may operate in conjunction with a virtual world simulation, such as Second Life.
The flowcharts and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
Although specific embodiments have been illustrated and described herein, those of ordinary skill in the art appreciate that any arrangement which is calculated to achieve the same purpose may be substituted for the specific embodiments shown and that the invention has other applications in other environments. This application is intended to cover any adaptations or variations of the invention. The following claims are in no way intended to limit the scope of the invention to the specific embodiments described herein.
Smith, Andrew Bryan, Bokor, Brian Ronald, House, Daniel Edward, Nicol, II, William Bruce, Haggar, Peter Frederick
Patent | Priority | Assignee | Title |
11146661, | Jun 28 2016 | REC ROOM INC | Systems and methods for detecting collaborative virtual gestures |
Patent | Priority | Assignee | Title |
7814041, | Mar 20 2007 | Xenogenic Development Limited Liability Company | System and method for control and training of avatars in an interactive environment |
20040225640, | |||
20060003305, | |||
20070113181, | |||
20080081701, | |||
20090144639, |
Executed on | Assignor | Assignee | Conveyance | Frame | Reel | Doc |
Apr 03 2008 | HAGGAR, PETER FREDERICK | International Business Machines Corporation | CORRECTIVE ASSIGNMENT TO CORRECT THE SPELLING OF FIRST INVENTOR NAME PREVIOUSLY RECORDED ON REEL 020763 FRAME 0432 ASSIGNOR S HEREBY CONFIRMS THE FIRST INVENTOR NAME SHOULD BE CHANGED FROM BRYAN RONALD BOKOR TO BRIAN RONALD BOKOR | 037040 | /0728 | |
Apr 03 2008 | NICOL, WILLIAM BRUCE, II | International Business Machines Corporation | CORRECTIVE ASSIGNMENT TO CORRECT THE SPELLING OF FIRST INVENTOR NAME PREVIOUSLY RECORDED ON REEL 020763 FRAME 0432 ASSIGNOR S HEREBY CONFIRMS THE FIRST INVENTOR NAME SHOULD BE CHANGED FROM BRYAN RONALD BOKOR TO BRIAN RONALD BOKOR | 037040 | /0728 | |
Apr 03 2008 | HOUSE, DANIEL EDWARD | International Business Machines Corporation | CORRECTIVE ASSIGNMENT TO CORRECT THE SPELLING OF FIRST INVENTOR NAME PREVIOUSLY RECORDED ON REEL 020763 FRAME 0432 ASSIGNOR S HEREBY CONFIRMS THE FIRST INVENTOR NAME SHOULD BE CHANGED FROM BRYAN RONALD BOKOR TO BRIAN RONALD BOKOR | 037040 | /0728 | |
Apr 03 2008 | SMITH, ANDREW BRYAN | International Business Machines Corporation | CORRECTIVE ASSIGNMENT TO CORRECT THE SPELLING OF FIRST INVENTOR NAME PREVIOUSLY RECORDED ON REEL 020763 FRAME 0432 ASSIGNOR S HEREBY CONFIRMS THE FIRST INVENTOR NAME SHOULD BE CHANGED FROM BRYAN RONALD BOKOR TO BRIAN RONALD BOKOR | 037040 | /0728 | |
Apr 03 2008 | BOKOR, BRIAN RONALD | International Business Machines Corporation | CORRECTIVE ASSIGNMENT TO CORRECT THE SPELLING OF FIRST INVENTOR NAME PREVIOUSLY RECORDED ON REEL 020763 FRAME 0432 ASSIGNOR S HEREBY CONFIRMS THE FIRST INVENTOR NAME SHOULD BE CHANGED FROM BRYAN RONALD BOKOR TO BRIAN RONALD BOKOR | 037040 | /0728 | |
Apr 03 2008 | HAGGAR, PETER FREDERICK | International Business Machines Corporation | ASSIGNMENT OF ASSIGNORS INTEREST SEE DOCUMENT FOR DETAILS | 020763 | /0432 | |
Apr 03 2008 | NICOL, WILLIAM BRUCE, II | International Business Machines Corporation | ASSIGNMENT OF ASSIGNORS INTEREST SEE DOCUMENT FOR DETAILS | 020763 | /0432 | |
Apr 03 2008 | HOUSE, DANIEL EDWARD | International Business Machines Corporation | ASSIGNMENT OF ASSIGNORS INTEREST SEE DOCUMENT FOR DETAILS | 020763 | /0432 | |
Apr 03 2008 | SMITH, ANDREW BRYAN | International Business Machines Corporation | ASSIGNMENT OF ASSIGNORS INTEREST SEE DOCUMENT FOR DETAILS | 020763 | /0432 | |
Apr 03 2008 | BOKOR, BRYAN RONALD | International Business Machines Corporation | ASSIGNMENT OF ASSIGNORS INTEREST SEE DOCUMENT FOR DETAILS | 020763 | /0432 | |
Apr 04 2008 | International Business Machines Corporation | (assignment on the face of the patent) | / |
Date | Maintenance Fee Events |
Apr 17 2019 | M1551: Payment of Maintenance Fee, 4th Year, Large Entity. |
Aug 21 2023 | REM: Maintenance Fee Reminder Mailed. |
Dec 11 2023 | M1552: Payment of Maintenance Fee, 8th Year, Large Entity. |
Dec 11 2023 | M1555: 7.5 yr surcharge - late pmt w/in 6 mo, Large Entity. |
Date | Maintenance Schedule |
Dec 29 2018 | 4 years fee payment window open |
Jun 29 2019 | 6 months grace period start (w surcharge) |
Dec 29 2019 | patent expiry (for year 4) |
Dec 29 2021 | 2 years to revive unintentionally abandoned end. (for year 4) |
Dec 29 2022 | 8 years fee payment window open |
Jun 29 2023 | 6 months grace period start (w surcharge) |
Dec 29 2023 | patent expiry (for year 8) |
Dec 29 2025 | 2 years to revive unintentionally abandoned end. (for year 8) |
Dec 29 2026 | 12 years fee payment window open |
Jun 29 2027 | 6 months grace period start (w surcharge) |
Dec 29 2027 | patent expiry (for year 12) |
Dec 29 2029 | 2 years to revive unintentionally abandoned end. (for year 12) |