Activity data generated during a day or other time period on one or more computing devices is collected and aggregated. The aggregated data is then presented through an activity review user interface. The activity review user interface can be presented on a large format display device, such as a projector or television. The activity review user interface can also be navigated using natural input methods, such as gesture and voice input.
18. An apparatus comprising:
a processor; and
a computer-readable storage medium having executable instructions stored thereupon which, when executed by the processor, cause the apparatus to
collect activity data generated by two or more applications executing on two or more computing devices during a day, wherein at least one of the two or more devices is a portable device;
provide the collected activity data to the two or more computing devices;
aggregate the collected activity data to generate an activity overview video that summarizes the activity data collected during the day by one or more of: chronologically, by event, according to projects or tasks, by people associated with the activity data, or by a geographical location of the computing devices when the activity data was collected;
provide a user interface for reviewing the aggregated activity data on the two or more computing devices, at least one of the two or more computing devices configured to provide the user interface on a large format display device for viewing the activity overview video, wherein the activity overview video comprises a multimedia file; and
enable the use of gesture input to navigate the activity overview video shown on the large format display device.
1. A computer-implemented method comprising performing computer-implemented operations for:
collecting activity data generated by two or more application programs executing on two or more computing devices during a time period for a user of the two or more computing devices;
aggregating the collected activity data to generate aggregated activity data for the time period,
wherein the aggregated activity data comprises an activity overview video that summarizes the activity data collected during the time period and a user interface to allow the review of the aggregated activity data, wherein the activity overview video comprises a multimedia file, and
wherein the aggregated activity data further comprises one or more of calendar data, location data, notes data, message data, audio data, video data, or a document utilized on or generated by the two or more computing devices during the time period;
providing the aggregated data to the two or more computing devices, wherein at least one of the two or more devices is a portable device; and
providing the user interface for reviewing the aggregated activity data on the two or more computing devices, at least one of the two or more computing devices configured to provide the user interface on a large format display device.
14. A computer-readable storage medium having computer-executable instructions stored thereupon which, when executed by a computer, cause the computer to:
collect activity data generated by two or more application programs executing on two or more devices during a time period for a user of the two or more computing devices, wherein at least one of the two or more devices is a portable device;
aggregate the collected activity data to generate aggregated activity data for the time period,
wherein the aggregated activity data comprises an activity overview video that summarizes the activity data collected during the time period and a user interface to allow the review of the aggregated activity data, wherein the activity overview video comprises a multimedia file, and
wherein the aggregated activity data further comprises one or more of calendar data, location data, notes data, message data, audio data, video data, or a document utilized on or generated by the two or more computing devices during the time period;
provide the aggregated data to the two or more devices; and
provide the user interface for reviewing the aggregated activity data on the two or more devices, at least one of the two or more devices configured to provide the user interface on a large format display device.
2. The computer-implemented method of
3. The computer-implemented method of
4. The computer-implemented method of
5. The computer-implemented method of
6. The computer-implemented method of
7. The computer-implemented method of
8. The computer-implemented method of
9. The computer-implemented method of
10. The computer-implemented method of
11. The computer-implemented method of
12. The computer-implemented method of
13. The computer-implemented method of
15. The computer-readable storage medium of
16. The computer-readable storage medium of
17. The computer-readable storage medium of
19. The apparatus of
20. The apparatus of
Information workers commonly rely on a variety of different computing devices during their workday. For instance, it is not uncommon for an information worker to spend a portion of their workday in the office working on a desktop computer, another portion of the workday out of the office working on a laptop computer or a smartphone, and yet another portion of the workday working on a tablet computing device or other type of portable device. These devices are frequently connected through “cloud” services, so information captured on each device may be available on the other devices.
The time spent during the workday by an information worker utilizing each type of computing device might result in different types of data being generated. For instance, in the same workday a worker might utilize a desktop or laptop computer to create documents, send electronic mail messages, and create meetings, and also utilize a smartphone to collect information, such as notes, photos, or video, and to participate in conference calls. As a result, information workers frequently review and generate data during a workday using many different devices and applications. Consequently, it can be difficult for information workers to marshal all of the activities performed and data created during a day and to review all of this data in an efficient fashion.
It is with respect to these and other considerations that the disclosure made herein is presented.
Technologies are described herein for collecting, aggregating, and presenting activity data so that a user can access and utilize the data they generate throughout the day on a variety of devices and services. Through an implementation of the concepts and technologies disclosed herein, data generated by an information worker during a day or other time period on multiple computing devices can be collected, aggregated, filtered, and presented to the worker for efficient review. This review may be utilized to help the information worker synthesize all of this information and plan for the next day. The aggregated data can be presented on any one of the computing devices or, alternatively, on a large format display device, such as a projector or television, and navigated using natural input methods, such as gesture and voice input. In this manner, an information worker can quickly and efficiently review data generated during a day or other time period and use this information to plan for the next day.
According to one aspect presented herein, activity data generated at one or more computing devices, such as desktop or laptop computers or smartphones, is collected during a period of time, such as a day, week, or month. Activity data is data that describes the activities performed by the user of a computing device during a particular time period, such as a day. Activity data might be generated by one or more programs executing on the computing device during the time period, either in response to user input or through passive collection by background services on the device. For instance, activity data might include calendar items, notes, to-do items, electronic mail and other types of messages, audio and video files, and documents. Activity data might also include data generated by an operating system of a computing device, such as location data indicating a geographic location of the device at a particular time. Activity data might be collected from a multitude of applications executing on the same device and from multiple devices used by the same individual.
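As a rough illustration of the kind of record this description implies, activity data can be modeled as a small tagged structure. The field names below (kind, device, location, payload) are assumptions made for the sketch, not terms taken from the disclosure:

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

# Illustrative record for a single piece of activity data. The fields
# mirror the disclosure's examples: a kind of item (note, calendar item,
# message, photo), the device it came from, a timestamp, an optional
# operating-system-supplied location, and item-specific content.
@dataclass
class ActivityItem:
    kind: str                       # e.g. "note", "calendar", "email", "photo"
    device: str                     # device on which the item was generated
    timestamp: datetime             # when the item was generated
    location: Optional[str] = None  # geographic location, if available
    payload: dict = field(default_factory=dict)  # item-specific content

# A note captured on a smartphone with no location data attached.
note = ActivityItem(kind="note", device="smartphone",
                    timestamp=datetime(2011, 5, 25, 9, 30),
                    payload={"text": "Follow up with new client"})
```

A real system would of course persist such records and attach richer metadata, but the shape above is enough to ground the aggregation steps that follow.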
Once the activity data has been collected, the activity data is aggregated to create aggregated activity data. The aggregated activity data is a collection of all of the activity data for a user from one or more applications executing on one or more devices during a period of time. For instance, the aggregated activity data might include all of the notes, calendar items, meetings, audio and video files, to-do items, and activities performed by an information worker during a day, as well as the locations and times at which they occurred. The activity data might be aggregated on the device upon which the data was generated, another of the user's devices, or transmitted to a server computer for aggregation thereupon.
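The core of the aggregation step, stripped of storage and transport concerns, is merging per-device activity lists into one time-ordered collection for the requested period. The sketch below uses plain dicts with illustrative keys ("timestamp", "device", "kind"); it is a minimal sketch of the idea, not the disclosed implementation:

```python
from datetime import datetime, date

def aggregate_activity(per_device_items, day):
    """Merge each device's activity items into one chronologically
    ordered list, keeping only items generated on the given day."""
    merged = [item
              for items in per_device_items.values()
              for item in items
              if item["timestamp"].date() == day]
    merged.sort(key=lambda item: item["timestamp"])
    return merged

items = aggregate_activity(
    {"desktop":    [{"timestamp": datetime(2011, 5, 25, 14, 0),
                     "device": "desktop", "kind": "email"}],
     "smartphone": [{"timestamp": datetime(2011, 5, 25, 9, 0),
                     "device": "smartphone", "kind": "note"},
                    {"timestamp": datetime(2011, 5, 26, 9, 0),
                     "device": "smartphone", "kind": "photo"}]},
    date(2011, 5, 25))
# The May 26 photo falls outside the requested day and is filtered out.
```

Whether this runs on the originating device or on a server, as the text allows, does not change the merge logic itself.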
Once the activity data has been collected and aggregated, the aggregated activity data may be presented to a user by way of a suitable activity review user interface (“UI”). For instance, in one embodiment, a computing device such as a smartphone is configured for connection to a large format display device, such as a television. The computing device is configured to output the activity review UI to the large format display device. The activity review UI includes elements for allowing a user to efficiently review the activity data generated and collected for the relevant time period. In one implementation, an activity overview video is generated and presented that allows the user to quickly review the activities that took place during the relevant time period. A user may be permitted to use natural input mechanisms, like voice and gesture input, to pause and resume playback of the activity overview video.
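The pause/resume behavior described above reduces to a tiny playback state machine driven by recognized commands. The command vocabulary here ("pause", "resume") is an assumption for illustration; a real gesture or voice front end would map its recognized inputs onto something like this:

```python
def handle_command(state, command):
    """Advance the overview-video playback state for one recognized
    voice or gesture command; unrecognized input leaves it unchanged."""
    if command == "pause" and state == "playing":
        return "paused"
    if command == "resume" and state == "paused":
        return "playing"
    return state  # ignore commands that do not apply in this state
```

For example, a "pause" command while playing yields the paused state, and stray input while paused leaves playback paused.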
In one example, a user might choose to review an event, such as a specific meeting. In this example, the displayed content could identify who attended the meeting (and potentially a location for each attendee), notes and documents related to the meeting, pictures taken during the meeting, searches done during the meeting, and potentially other information. All of the information would be presented as an event, and the user would be able to delve into more specific information regarding the event, as necessary.
According to various embodiments, the activity data shown in the activity review UI might be organized chronologically, by event, according to projects or tasks, by people associated with the activity data, or based upon the geographical location of the computing device when the activity data was generated. The activity review UI might also be organized in other ways in other embodiments.
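The organization modes listed above amount to either a sort (chronological) or a grouping on one attribute of each item (event, project, person, location). A minimal sketch, with key names assumed for illustration:

```python
from collections import defaultdict

def organize(items, by):
    """Organize activity items for display: chronologically as a sorted
    list, or grouped into a dict keyed by the chosen attribute."""
    if by == "chronological":
        return sorted(items, key=lambda item: item["timestamp"])
    groups = defaultdict(list)
    for item in items:
        groups[item.get(by, "unknown")].append(item)
    return dict(groups)

groups = organize(
    [{"location": "office", "kind": "email"},
     {"location": "cafe",   "kind": "note"},
     {"location": "office", "kind": "document"}],
    by="location")
```

The same function serves each mode the text names simply by changing the grouping key, which is why the UI can offer several organizations over one aggregated data set.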
According to various embodiments, the activity review user interface may be navigated using traditional input devices, such as a keyboard or mouse, on a desktop or laptop computer. The activity review user interface might also be navigated using touch input on a smartphone, tablet device, or other type of computing device. In other embodiments, the user may be permitted to navigate the user interface using natural input mechanisms, such as gesture or voice input. This could be done in a small form factor, or in a larger, review-optimized form factor.
It should be appreciated that this Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended that this Summary be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
The following detailed description is directed to technologies for collecting, aggregating, and presenting activity data. As discussed briefly above, activity data generated during a day or other time period on one or more computing devices is collected and aggregated. The aggregated data is then presented through an activity review user interface. The activity review user interface can be presented on the computing devices or on a large format display device, such as a projector or television. The activity review user interface can also be navigated using traditional input mechanisms, such as keyboard, mouse, and touch, and may also be navigated using natural input methods, such as gesture and voice input. Additional details regarding these and other features will be provided below with regard to
While the subject matter described herein is presented in the general context of program modules that execute in conjunction with the execution of an operating system and application programs on a computer system, those skilled in the art will recognize that other implementations may be performed in combination with other types of program modules. Generally, program modules include routines, programs, components, data structures, and other types of structures that perform particular tasks or implement particular abstract data types. Moreover, those skilled in the art will appreciate that the subject matter described herein may be practiced with other computer system configurations, including hand-held devices, multiprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers, and the like.
In the following detailed description, references are made to the accompanying drawings that form a part hereof, and in which are shown, by way of illustration, specific embodiments or examples. Referring now to the drawings, in which like numerals represent like elements through the several figures, aspects of a computing system and methodology for collecting, aggregating, and presenting activity data will be described.
The routine 100 begins at operation 102, where activity data is collected from applications executing on one or more computing devices. As described briefly above, activity data is data that describes the activities performed by the user of a computing device during a particular time period, such as a day. Activity data might be generated by one or more programs executing on the computing device during the time period in response to user input. For instance, activity data might include calendar items, notes, to-do items, electronic mail and other types of messages, audio and video files, and documents. Activity data might also include data generated by an operating system of a computing device, such as location data indicating a geographic location of the device at a particular time, gestures made on the device such as shaking of the device, and information regarding audio collected by the device such as whether users spoke in a calm or agitated tone. Activity data might be collected from a multitude of applications executing on the same device or from multiple devices used by the same individual. Additional details regarding the collection of activity data from applications and devices will be provided below with regard to
From operation 102, the routine 100 proceeds to operation 104, where the collected activity data is aggregated and filtered. As discussed briefly above, the aggregated activity data is a collection of all of the activity data for a user from one or more applications executing on one or more computing devices during a period of time. For instance, the aggregated activity data might include all of the notes, calendar items, meetings, audio and video files, to-do items, and activities performed by an information worker in one day, week, or month. The activity data might be aggregated on the same device upon which the data was generated or transmitted to a server computer for aggregation thereupon.
Data might also be aggregated according to the “event” at which the data was collected. For instance, as described briefly above, data might be aggregated that identifies all of the people who attended a meeting (and potentially a location for each attendee), notes and documents related to the meeting, pictures taken during the meeting, searches done during the meeting, and potentially other information. As will be described in greater detail below, all of this information can be presented as an event, and a user may be able to delve into more specific information regarding the event, as necessary. Additional details regarding the aggregation of collected activity data will be provided below with regard to
From operation 104, the routine 100 proceeds to operation 106, where the aggregated activity data is organized and presented to a user for review. According to various embodiments, the aggregated activity data may be organized chronologically, by event, according to projects or tasks, by people associated with the activity data, or based upon the geographical location of the computing device when the activity data was generated. The aggregated activity data might also be organized in other ways in other embodiments.
As also described briefly above, the aggregated activity data might also be presented in an appropriate activity review UI at operation 108. As discussed briefly above, a computing device, such as a smartphone, is configured in one embodiment for connection to a large format display device, such as a television. The computing device is configured to output the activity review UI to the large format display device. The activity review UI includes elements for allowing a user to efficiently review the activity data generated and collected for the relevant time period. In one implementation, an activity overview video is generated and presented that allows the user to quickly review the activities that took place during the relevant time period. It should be appreciated that the activity review UI might also be presented on any of the devices that collected the data or other devices, such as a desktop, laptop, or tablet computer.
According to embodiments, various actions can be taken using the activity review UI. For instance, an e-mail message might be transmitted to all of the attendees of a meeting. In another example, an overall task list might be generated and displayed for all of the tasks created during a meeting. Each task might be assigned a higher or lower priority or assigned to another individual for handling. Other types of actions might also be taken through the activity review UI with respect to the aggregated activity data.
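The task-list action described above can be sketched as filtering the aggregated data for task items and then letting a reviewer reprioritize or reassign one. The field names and the two helper functions are assumptions for illustration, not the disclosed implementation:

```python
def build_task_list(aggregated):
    """Collect every task item from the aggregated activity data."""
    return [item for item in aggregated if item["kind"] == "task"]

def reassign(task, assignee, priority=None):
    """Return a copy of a task reassigned to another individual,
    optionally with a new priority; the original record is untouched."""
    updated_task = dict(task)
    updated_task["assignee"] = assignee
    if priority is not None:
        updated_task["priority"] = priority
    return updated_task

tasks = build_task_list([
    {"kind": "task", "title": "Send proposal", "priority": "normal"},
    {"kind": "note", "text": "venue ideas"},
    {"kind": "task", "title": "Book room", "priority": "normal"},
])
updated = reassign(tasks[0], assignee="alice", priority="high")
```

Emailing all meeting attendees would follow the same pattern: filter the aggregated data for the meeting's attendee records, then hand the resulting list to a mail client.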
The activity review UI may be navigated using traditional input devices, such as a keyboard or mouse. In other embodiments, however, a user may be permitted to navigate the activity review UI using natural input mechanisms, such as gesture or voice input, when hardware supporting such input is available. Additional details regarding the presentation of aggregated activity data and the activity review UI will be provided below with regard to
During any given time period a user might use all or a subset of the computing devices 202A-202D shown in
According to the various embodiments presented herein, each of the computing devices 202A-202D is configured to collect activity data 208A-208D (which may be referred to herein collectively as the “activity data 208”), respectively, generated by a user. As described above, the activity data 208A-208D is data that describes the activities performed by the user of a computing device 202A-202D during a particular time period, such as a day. The activity data 208 might be generated by one or more programs executing on the computing devices 202A-202D during the time period generated in response to user input. For instance, the activity data 208 might include calendar items, notes, to-do items, electronic mail and other types of messages, audio and video files, and documents. The activity data 208 might also include other types of information.
As shown in
The aggregation module 212 is configured to receive the activity data 208 and to aggregate the activity data 208 received from different applications and computing devices to create the aggregated activity data 214. As discussed above, the aggregated activity data 214 is a collection of all of the activity data 208 for a user from one or more applications executing on one or more computing devices 202A-202D during a period of time.
For instance, the aggregated activity data 214 might include all of the notes, calendar items, meetings, audio and video files, to-do items, and other activities reviewed, created, or performed by a user in a certain time period, such as one day, week, or month. As will be described in greater detail below, the aggregated activity data 214 might be consumed by a program executing on one of the computing devices 202A-202D to present the activity review UI described above. Additional details regarding this process will be provided below with regard to
According to one embodiment, the aggregation module 212 is configured to generate an activity overview video 216 from the aggregated activity data 214. The activity overview video 216 is a multimedia video file that summarizes the activity data 208 collected during a time period. For instance, the activity overview video 216 might include audio and visual information summarizing all of the activities, notes, tasks and other activities performed by a user on any of the computing devices 202A-202D during a certain time period. The computing devices 202A-202D, or another device, might be utilized to view and interact with the activity overview video 216. The activity overview video 216 might also be formatted similarly to the activity review UI shown in
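Actual encoding of the multimedia file is outside the scope of a sketch, but the planning step — turning a day's items into an ordered list of clip segments that a renderer could assemble into the overview video — can be illustrated. The duration heuristic (longer segments for meetings) is an assumption, not something the disclosure specifies:

```python
from datetime import datetime

def plan_overview_segments(items, default_secs=3, meeting_secs=8):
    """Order a day's activity items chronologically and assign each a
    clip length, producing a segment plan for the overview video."""
    segments = []
    for item in sorted(items, key=lambda i: i["timestamp"]):
        secs = meeting_secs if item["kind"] == "meeting" else default_secs
        segments.append({"kind": item["kind"], "seconds": secs})
    return segments

segments = plan_overview_segments([
    {"kind": "note",    "timestamp": datetime(2011, 5, 25, 9, 0)},
    {"kind": "meeting", "timestamp": datetime(2011, 5, 25, 10, 0)},
    {"kind": "photo",   "timestamp": datetime(2011, 5, 25, 10, 30)},
])
```

A renderer consuming this plan would emit one visual summary per segment, which is consistent with the text's note that the video might be formatted similarly to the activity review UI.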
It should be appreciated that the architecture and implementation shown in
For instance, in the example shown in
In the example shown in
In the example shown in
The set-top box might be a video game system, cable box, dedicated set-top box, or another type of component. The voice and/or gesture recognition hardware 406 might also be connected to and utilized with another type of computing device, such as the laptop computing device 202B or the desktop computing device 202C. As will be described in greater detail below, the various types of user input devices and mechanisms disclosed above might be utilized to navigate the activity review UI presented by one of the computing devices 202A-202B for reviewing the aggregated activity data 214. Additional details regarding this process will be provided below with regard to
The user interface 500A shown in
The user interface 500A also includes a tasks portion 504, which provides details regarding the new tasks created during the time period. In the example shown in
In one embodiment, user interface controls corresponding to the events shown in the calendar in the overview portion 502 might be selected in order to obtain additional information regarding a particular selected meeting. For example, if the meeting entitled “New Client Sync Up Meeting” in
The user interface 600 shown in
The user interface 700 shown in
The user interface 800 shown in
It should be appreciated that the user interfaces shown in
As discussed above, the user interfaces shown in
In other embodiments, a user might be permitted to view the aggregated activity data 214 on a large format display device 402 and, concurrently, to modify or augment the data using another computing device 202. For instance, a user might be permitted to mark items as tasks, add tasks to a “follow up” list, or mark items as being of high importance. Because the aggregated activity data 214 is stored at the server computer 206 in one embodiment, multiple computing devices 202 might access and utilize this data concurrently in different ways.
The computer architecture shown in
The mass storage device 910 is connected to the CPU 902 through a mass storage controller (not shown) connected to the bus 904. The mass storage device 910 and its associated computer-readable storage media provide non-volatile storage for the computer 900. Although the description of computer-readable media contained herein refers to a mass storage device, such as a hard disk or CD-ROM drive, it should be appreciated by those skilled in the art that computer-readable storage media can be any available computer storage media that can be accessed by the computer 900.
By way of example, and not limitation, computer-readable storage media may include volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data. For example, computer-readable storage media includes, but is not limited to, RAM, ROM, EPROM, EEPROM, flash memory or other solid state memory technology, CD-ROM, digital versatile disks (“DVD”), HD-DVD, BLU-RAY, or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transitory medium which can be used to store the desired information and which can be accessed by the computer 900.
It should be appreciated that the computer-readable media disclosed herein also encompasses communication media. Communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. Combinations of any of the above should also be included within the scope of computer readable media. Computer-readable storage media does not encompass communication media.
According to various embodiments, the computer 900 may operate in a networked environment using logical connections to remote computers through a network such as the network 920. The computer 900 may connect to the network 920 through a network interface unit 906 connected to the bus 904. It should be appreciated that the network interface unit 906 may also be utilized to connect to other types of networks and remote computer systems. The computer 900 may also include an input/output controller 912 for receiving and processing input from a number of other devices, including a keyboard, mouse, or electronic stylus (not shown in
As mentioned briefly above, a number of program modules and data files may be stored in the mass storage device 910 and RAM 914 of the computer 900, including an operating system 904 suitable for controlling the operation of a networked desktop, laptop, or server computer. The mass storage device 910 and RAM 914 may also store one or more program modules. In particular, the mass storage device 910 and the RAM 914 may store one or more of the software components described above. The mass storage device 910 and RAM 914 may also store other program modules and data.
In general, software applications or modules may, when loaded into the CPU 902 and executed, transform the CPU 902 and the overall computer 900 from a general-purpose computing system into a special-purpose computing system customized to perform the functionality presented herein. The CPU 902 may be constructed from any number of transistors or other discrete circuit elements, which may individually or collectively assume any number of states. More specifically, the CPU 902 may operate as one or more finite-state machines, in response to executable instructions contained within the software or modules. These computer-executable instructions may transform the CPU 902 by specifying how the CPU 902 transitions between states, thereby physically transforming the transistors or other discrete hardware elements constituting the CPU 902.
Encoding the software or modules onto a mass storage device may also transform the physical structure of the mass storage device or associated computer readable storage media. The specific transformation of physical structure may depend on various factors, in different implementations of this description. Examples of such factors may include, but are not limited to: the technology used to implement the computer readable storage media, whether the computer readable storage media are characterized as primary or secondary storage, and the like. For example, if the computer readable storage media is implemented as semiconductor-based memory, the software or modules may transform the physical state of the semiconductor memory, when the software is encoded therein. For example, the software may transform the states of transistors, capacitors, or other discrete circuit elements constituting the semiconductor memory.
As another example, the computer readable storage media may be implemented using magnetic or optical technology. In such implementations, the software or modules may transform the physical state of magnetic or optical media, when the software is encoded therein. These transformations may include altering the magnetic characteristics of particular locations within given magnetic media. These transformations may also include altering the physical features or characteristics of particular locations within given optical media, to change the optical characteristics of those locations. Other transformations of physical media are possible without departing from the scope and spirit of the present description, with the foregoing examples provided only to facilitate this discussion.
Based on the foregoing, it should be appreciated that technologies for collecting, aggregating, and presenting activity data have been disclosed herein. Although the subject matter presented herein has been described in language specific to computer structural features, methodological acts, and computer readable media, it is to be understood that the invention defined in the appended claims is not necessarily limited to the specific features, acts, or media described herein. Rather, the specific features, acts and mediums are disclosed as example forms of implementing the claims.
The subject matter described above is provided by way of illustration only and should not be construed as limiting. Various modifications and changes may be made to the subject matter described herein without following the example embodiments and applications illustrated and described, and without departing from the true spirit and scope of the present invention, which is set forth in the following claims.
Inventors: Rajesh Ramanathan; Hubert Van Hoof; Ellen Lizabeth Chisa
Executed on | Assignor | Assignee | Conveyance | Reel/Frame
May 25 2011 | Chisa, Ellen Lizabeth | Microsoft Corporation | Assignment of assignors' interest (see document for details) | 026383/0767
May 25 2011 | Ramanathan, Rajesh | Microsoft Corporation | Assignment of assignors' interest (see document for details) | 026383/0767
May 25 2011 | Hoof, Hubert Van | Microsoft Corporation | Assignment of assignors' interest (see document for details) | 026383/0767
Jun 03 2011 | — | Microsoft Technology Licensing, LLC | Assignment on the face of the patent | —
Oct 14 2014 | Microsoft Corporation | Microsoft Technology Licensing, LLC | Assignment of assignors' interest (see document for details) | 034544/0001
Date | Maintenance Fee Events |
Jun 06 2016 | ASPN: Payor Number Assigned. |
Jun 06 2016 | RMPN: Payer Number De-assigned. |
Oct 04 2019 | M1551: Payment of Maintenance Fee, 4th Year, Large Entity. |
Dec 11 2023 | REM: Maintenance Fee Reminder Mailed. |
May 27 2024 | EXP: Patent Expired for Failure to Pay Maintenance Fees. |