A method for animating graphical objects is provided. In one embodiment, the method includes providing a plurality of graphical objects and displaying a subset of the objects in a viewport. In this embodiment, the method may also include calculating a virtual destination for one of the graphical objects based on a received user input, and moving the graphical object to the virtual destination over a period of time. Various additional methods, machine-readable media, and devices for animating graphical objects and controlling operational characteristics of a device are also provided.
42. A method, comprising:
performing, by a computing device:
displaying a subset of graphical objects of a plurality of graphical objects, wherein the plurality of graphical objects are associated with respective coordinates;
in response to a user input, determining a virtual position for at least one graphical object of the subset of graphical objects, wherein the virtual position is associated with coordinates corresponding to a location outside of a display area;
moving the at least one graphical object from its respective coordinates to the coordinates of the virtual position over a pre-determined period of time according to a time dependent function as time elapses during the pre-determined period of time;
in response to another user input, determining another virtual position for the at least one graphical object or another at least one graphical object of the subset of graphical objects, wherein the other virtual position is associated with coordinates corresponding to a location outside of the display area; and
moving the at least one graphical object or the other at least one graphical object from its respective coordinates to the coordinates of the other virtual position over the pre-determined period of time according to the time dependent function as time elapses during the pre-determined period of time, wherein a distance to move the at least one graphical object to the virtual position and a distance to move the at least one graphical object or the other at least one graphical object to the other virtual position are different distances.
16. One or more non-transitory machine-readable storage media having application instructions encoded thereon, the application instructions comprising:
instructions for displaying a portion of a sequence of images;
instructions for calculating a virtual destination for a particular image of the sequence of images based on a user input;
instructions for animating the sequence of images over a first pre-determined time period according to a time dependent function as time elapses during the first pre-determined time period such that the particular image arrives at the virtual destination at the end of the first pre-determined time period;
instructions for calculating another virtual destination for the particular image or for another particular image of the sequence of images based on another user input, wherein the other virtual destination is outside of a display of a device displaying the portion of the sequence of images and wherein the other virtual destination is further away from or closer to the particular image or the other particular image in the sequence of images than is the virtual destination in the sequence of images from the particular image; and
instructions for animating the sequence of images over another pre-determined time period having a same duration as the first pre-determined time period according to the time dependent function as time elapses during the other pre-determined time period such that the particular image or the other particular image arrives at the other virtual destination at the end of the other pre-determined time period.
41. A non-transitory computer-readable storage medium storing program instructions, wherein the program instructions are computer-executable to implement:
displaying a subset of graphical objects of a plurality of graphical objects, wherein the plurality of graphical objects are associated with respective coordinates;
in response to user input, determining a virtual position for at least one graphical object of the subset of graphical objects, wherein the virtual position is associated with coordinates corresponding to a location outside of a display area;
moving the at least one graphical object from its respective coordinates to the coordinates of the virtual position over a pre-determined period of time according to a time dependent function as time elapses during the pre-determined period of time;
in response to another user input, determining another virtual position for the at least one graphical object or another at least one graphical object of the subset of graphical objects, wherein the other virtual position is associated with coordinates corresponding to a location outside of the display area; and
moving the at least one graphical object or the other at least one graphical object from its respective coordinates to the coordinates of the other virtual position over the pre-determined period of time according to the time dependent function as time elapses during the pre-determined period of time, wherein a distance to move the at least one graphical object to the virtual position and a distance to move the at least one graphical object or the other at least one graphical object to the other virtual position are different distances.
6. A method comprising:
providing a plurality of graphical objects;
displaying a subset of the plurality of graphical objects in a viewport of a display;
receiving a user input;
calculating a virtual destination for at least one graphical object of the subset based on the user input;
moving the at least one graphical object to the virtual destination over a pre-determined time period according to a time dependent function as time elapses during the pre-determined time period, wherein moving the at least one graphical object comprises moving the at least one graphical object from an actual display position within the viewport to a virtual display position outside of the viewport of the display such that the at least one graphical object arrives at the virtual display position at the end of the pre-determined time period;
receiving another user input;
calculating another virtual destination for another at least one graphical object of the subset based on the other user input, wherein a distance between the at least one graphical object and the virtual destination and another distance between the other at least one graphical object and the other virtual destination are different distances; and
moving the other at least one graphical object to the other virtual destination over the pre-determined time period according to the time dependent function as time elapses during the pre-determined time period, wherein moving the other at least one graphical object comprises moving the other at least one graphical object from an actual display position within the viewport to the other virtual display position outside of the viewport of the display such that the other at least one graphical object arrives at the other virtual display position at the end of the pre-determined time period.
13. A device comprising:
a housing;
a display disposed in the housing;
a memory device disposed in the housing, the memory device including executable application instructions stored therein; and
a processor disposed in the housing and configured to execute the application instructions stored in the memory device;
wherein the application instructions, when executed by the processor, cause the device to: pan through a plurality of images in response to a user input,
wherein to pan through the plurality of images the application instructions when executed by the processor cause the device to:
calculate, based on the user input, a virtual destination outside of a viewport of the display;
move at least one image of the plurality of images over a pre-determined period of time to the virtual destination according to a time dependent function as time elapses during the pre-determined period of time; and
move at least one other image of the plurality of images at least between a virtual display position outside of the display and an actual display position within the display according to the time dependent function as time elapses during the pre-determined period of time, and
wherein to perform another pan through the plurality of images the application instructions when executed by the processor cause the device to:
calculate, based on another user input, another virtual destination outside of the viewport of the display, wherein a distance to the virtual destination and a distance to the other virtual destination are different distances;
move at least one image of the plurality of images over the pre-determined period of time to the other virtual destination according to the time dependent function as time elapses during the pre-determined period of time; and
move at least one other image of the plurality of images at least between a virtual display position outside of the display and an actual display position within the display according to the time dependent function as time elapses during the pre-determined period of time.
1. A method comprising:
defining a plurality of evaluators associated with respective operational characteristics of a device;
defining a control function dependent on time and including first and second evaluators of the plurality of evaluators, wherein at least one of the first or second evaluators includes an additional function dependent on time; and
storing the plurality of evaluators and the control function in a memory of the device, wherein the device is configured to vary the operational characteristics associated with the first and second evaluators based at least in part on the control function;
wherein the operational characteristics associated with the first and second evaluators comprise one or more display characteristics of a given graphical object, and the control function comprises a rendering function;
the method further comprising:
determining a target position for a graphical object outside of a display of the device based on a user input;
animating the graphical object;
varying the one or more display characteristics associated with the first and second evaluators during animation of the graphical object over a pre-determined period of time based at least in part on the rendering function;
determining another target position for the graphical object or another graphical object outside of the display of the device based on another user input, wherein the target position and the other target position are different distances from a position of the graphical object or the other graphical object;
animating the graphical object or the other graphical object; and
varying the one or more display characteristics associated with the first and second evaluators during animation of the graphical object or the other graphical object over the pre-determined period of time based at least in part on the rendering function,
wherein the rendering function comprises a recursive interpolation function for animating the given graphical object, over the pre-determined period of time, between a source position on the display of the device and a target position outside the display of the device; and wherein a current position for the given graphical object being animated between the source position and the target position at a given time within the pre-determined period of time is defined at least in part by one of the first or second evaluators including the additional function dependent on time such that the current position varies with respect to time, and the given graphical object is animated from the source position to the target position over the pre-determined period of time such that the given graphical object arrives substantially at the target position at the end of the pre-determined period of time.
2. The method of
3. The method of
4. The method of
5. The method of
7. The method of
receiving an additional user input while moving the at least one graphical object during the pre-determined time period;
updating the virtual destination for the at least one graphical object based on the additional user input; and
moving the at least one graphical object to the updated virtual destination over an additional pre-determined time period.
8. The method of
9. The method of
11. The method of
12. The method of
calculating virtual destinations for each graphical object of the plurality of graphical objects based on the user input; and
animating the sequence of graphical objects of the plurality of graphical objects within the viewport over the pre-determined time period such that each graphical object of the plurality of graphical objects is moved to its respective virtual destination at the end of the pre-determined time period.
14. The device of
15. The device of
17. The one or more non-transitory machine-readable storage media of
18. The one or more non-transitory machine-readable storage media of
19. The method of claim 6, further comprising adjusting a plurality of display characteristics of the at least one graphical object or the other at least one graphical object, wherein the display characteristics include position, velocity, acceleration, size, opacity, or color.
20. The method of claim 6, wherein at least some of the subset of graphical objects are displayed at an angle relative to the display.
21. The method of claim 6, wherein the virtual destination or the other virtual destination is calculated in three dimensions.
22. The method of claim 6, wherein said moving the at least one graphical object or said moving the other at least one graphical object is performed about a virtual axis.
23. The method of claim 6, wherein an opacity of the at least one graphical object at the actual display position is different than an opacity of the at least one graphical object at the virtual display position or the other virtual display position; or
an opacity of the other at least one graphical object at the actual display position is different than an opacity of the other at least one graphical object at the other virtual display position.
24. The method of claim 6, wherein the calculated virtual destination includes the virtual display position and the other calculated virtual destination includes the other virtual display position.
25. The device of claim 13, wherein the plurality of images comprises cover art for a media file.
26. The device of claim 13, wherein the application instructions when executed by the processor further cause the device to adjust display characteristics of the at least one image, wherein the display characteristics include position, velocity, acceleration, size, opacity, or color.
27. The device of claim 13, wherein the at least one image or the other at least one image is moved over an additional pre-determined time period to an updated virtual destination that is calculated from an additional user input received during the pan through the plurality of images or the other pan through the plurality of images.
28. The device of claim 13, wherein the user input or the other user input is received as an input to a touch-sensitive user interface.
29. The device of claim 13, wherein at least some of the plurality of images are displayed at an angle relative to the display.
30. The device of claim 13, wherein the virtual destination or the other virtual destination is calculated in three dimensions.
31. The device of claim 13, wherein the at least one image is moved about a virtual axis.
32. The device of claim 13, wherein an opacity of the at least one image at the actual display position is different than an opacity of the at least one image at the virtual display position or the other virtual display position.
33. The device of claim 13, wherein the calculated virtual destination includes the virtual display position, wherein the virtual display position includes coordinates outside of the display.
34. The one or more non-transitory machine-readable storage media of claim 16, wherein the application instructions further comprise:
instructions for adjusting display characteristics of the particular image or the other particular image, wherein the display characteristics include at least one of position, velocity, acceleration, size, opacity, or color.
35. The one or more non-transitory machine-readable storage media of claim 16, wherein the application instructions further comprise:
instructions for updating the virtual destination for the particular image based on an additional user input; and
instructions for animating the sequence of images over a second pre-determined time period such that the particular image arrives at the updated virtual destination at the end of the second pre-determined time period.
36. The one or more non-transitory machine-readable storage media of claim 16, wherein the user input or the other user input is received via a touch-sensitive user interface, and wherein the calculated virtual destination or the other calculated virtual destination depends on a speed of a user movement on the touch-sensitive user interface.
37. The one or more non-transitory machine-readable storage media of claim 16, wherein at least some of the images in the sequence of images are displayed at an angle relative to the display.
38. The one or more non-transitory machine-readable storage media of claim 16, wherein the virtual destination or the other virtual destination is calculated in three dimensions.
39. The one or more non-transitory machine-readable storage media of claim 16, wherein said animating is performed about a virtual axis.
40. The one or more non-transitory machine-readable storage media of claim 16, wherein an opacity of the particular image at its virtual destination is different than an opacity of the particular image at another destination and an opacity of the particular image or the other particular image at its other virtual destination is different than an opacity of the particular image or the other particular image at another destination.
1. Technical Field
The present invention relates generally to image processing and, more particularly, to the animation of graphical objects within graphical user interfaces.
2. Description of the Related Art
This section is intended to introduce the reader to various aspects of art that may be related to various aspects of the present invention, which are described and/or claimed below. This discussion is believed to be helpful in providing the reader with background information to facilitate a better understanding of the various aspects of the present invention. Accordingly, it should be understood that these statements are to be read in this light, and not as admissions of prior art.
Electronic devices and systems increasingly include display screens as part of the user interface of the device or system. As may be appreciated, display screens may be employed in a wide array of devices and systems, including desktop computer systems, notebook computers, and handheld computing devices, as well as various consumer products, such as cellular phones and portable media players. Such display screens may be useful for displaying status information about the device or for displaying information about an operation being performed by the device. For example, portable music and/or video players may display information about a music or video file being played by the device, such as the title of the song or video being played, the time remaining, the time elapsed, the artist or cast, or other information of interest. Alternatively, the display of such a device may display a piece of artwork or an arbitrary design during operation of the device.
In some instances, it may be desirable to show an image including one or more graphical objects on the display screen, and to allow a user to pan through a relatively large set of such graphical objects. Further, in some cases, the number of graphical objects may exceed that which may be conveniently displayed at one time. In these cases, the display screen may depict only a subset of the total number of graphical objects, and the particular displayed subset may change as a user pans through the total number of graphical objects. Further, the animation of images and graphical objects may consume a significant amount of memory and processing resources, which may negatively impact performance of the electronic device.
Certain aspects of embodiments disclosed herein by way of example are summarized below. It should be understood that these aspects are presented merely to provide the reader with a brief summary of certain forms an invention disclosed and/or claimed herein might take and that these aspects are not intended to limit the scope of any invention disclosed and/or claimed herein. Indeed, any invention disclosed and/or claimed herein may encompass a variety of aspects that may not be set forth below.
The present disclosure generally relates to techniques for animating graphical objects and images, and for controlling other operational aspects of a device or system. In accordance with one disclosed embodiment, an exemplary method may include controlling and adjusting display characteristics of an image via one or more evaluators. Such display characteristics, in various embodiments, may include position, velocity, acceleration, size, and/or opacity, for example. In some embodiments, multiple evaluators may be incorporated into a rendering function to create complex visual animation effects of one or more graphical objects. In other embodiments, evaluators may be used to control other operational aspects of a device, such as a sound generated by the device. In accordance with another disclosed embodiment, an exemplary method may include displaying a portion of a set of images, and moving one or more of the images to a virtual destination in response to a user input. Further, in some embodiments, the virtual destination may be updated in mid-animation upon the receipt of an additional user input, facilitating a smooth animation of the object from its original position to the updated virtual destination.
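The mid-animation update of a virtual destination mentioned above may be sketched in code. The following Python fragment is illustrative only; the class and method names (Animation, retarget) are assumptions introduced for this example and are not part of the disclosure. It shows linear interpolation toward a virtual destination, with the interpolation restarted from the object's current position when a new input arrives, so the motion remains smooth.

```python
class Animation:
    """Illustrative sketch: an object moving toward a virtual destination,
    whose destination may be updated mid-animation by a new user input."""

    def __init__(self, start, destination, duration):
        self.origin = start
        self.destination = destination
        self.duration = duration
        self.elapsed = 0.0

    def position(self):
        # Linear time-dependent function: fraction of the period elapsed.
        frac = min(self.elapsed / self.duration, 1.0)
        return tuple(o + (d - o) * frac
                     for o, d in zip(self.origin, self.destination))

    def advance(self, dt):
        self.elapsed += dt

    def retarget(self, new_destination, duration):
        # Restart the interpolation from the current position, so the
        # object glides smoothly to the updated virtual destination.
        self.origin = self.position()
        self.destination = new_destination
        self.duration = duration
        self.elapsed = 0.0
```

In use, a call to retarget while the animation is in flight replaces the destination without any visible jump, since the new path begins at the object's current on-screen coordinates.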
Various refinements of the features noted above may exist in relation to various aspects of the present invention. Further features may also be incorporated in these various aspects as well. These refinements and additional features may exist individually or in any combination. For instance, various features discussed below in relation to one or more of the illustrated embodiments may be incorporated into any of the above-described aspects of the present invention alone or in any combination. Again, the brief summary presented above is intended only to familiarize the reader with certain aspects and contexts of embodiments of the present invention without limitation to the claimed subject matter.
These and other features, aspects, and advantages of the present invention will become better understood when the following detailed description of certain exemplary embodiments is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:
One or more specific embodiments of the present invention will be described below. These described embodiments are only exemplary of the present invention. Additionally, in an effort to provide a concise description of these exemplary embodiments, all features of an actual implementation may not be described in the specification. It should be appreciated that in the development of any such actual implementation, as in any engineering or design project, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which may vary from one implementation to another. Moreover, it should be appreciated that such a development effort might be complex and time consuming, but would nevertheless be a routine undertaking of design, fabrication, and manufacture for those of ordinary skill having the benefit of this disclosure.
An exemplary electronic device 10 is illustrated in
In certain embodiments, the device 10 may be powered by one or more rechargeable and/or replaceable batteries. Such embodiments may be highly portable, allowing a user to carry the electronic device 10 while traveling, working, exercising, and so forth. In this manner, and depending on the functionalities provided by the electronic device 10, a user may listen to music, play games or video, record video or take pictures, place and receive telephone calls, communicate with others, control other devices (e.g., via remote control and/or Bluetooth functionality), and so forth while moving freely with the device 10. In addition, device 10 may be sized such that it fits relatively easily into a pocket or a hand of the user. While certain embodiments of the present invention are described with respect to a portable electronic device, it should be noted that the presently disclosed techniques may be applicable to a wide array of other, less portable, electronic devices and systems that are configured to render graphical data, such as a desktop computer.
In the presently illustrated embodiment, the exemplary device 10 includes an enclosure or housing 12, a display 14, user input structures 16, and input/output connectors 18. The enclosure 12 may be formed from plastic, metal, composite materials, or other suitable materials, or any combination thereof. The enclosure 12 may protect the interior components of the electronic device 10 from physical damage, and may also shield the interior components from electromagnetic interference (EMI).
The display 14 may be a liquid crystal display (LCD), a light emitting diode (LED) based display, an organic light emitting diode (OLED) based display, or some other suitable display. In accordance with certain embodiments of the present invention, the display 14 may display a user interface and various other images, such as logos, avatars, photos, album art, and the like. Additionally, in one embodiment, the display 14 may include a touch screen through which a user may interact with the user interface. The display may also include various function and/or system indicators to provide feedback to a user, such as power status, call status, memory status, or the like. These indicators may be incorporated into the user interface displayed on the display 14.
In one embodiment, one or more of the user input structures 16 are configured to control the device 10, such as by controlling a mode of operation, an output level, an output type, etc. For instance, the user input structures 16 may include a button to turn the device 10 on or off. Further, the user input structures 16 may allow a user to interact with the user interface on the display 14. Embodiments of the portable electronic device 10 may include any number of user input structures 16, including buttons, switches, a control pad, a scroll wheel, or any other suitable input structures. The user input structures 16 may work with the user interface displayed on the device 10 to control functions of the device 10 and/or any interfaces or devices connected to or used by the device 10. For example, the user input structures 16 may allow a user to navigate a displayed user interface or to return such a displayed user interface to a default or home screen.
The exemplary device 10 may also include various input and output ports 18 to allow connection of additional devices. For example, a port 18 may be a headphone jack that provides for the connection of headphones. Additionally, a port 18 may have both input/output capabilities to provide for connection of a headset (e.g., a headphone and microphone combination). Embodiments of the present invention may include any number of input and/or output ports, such as headphone and headset jacks, universal serial bus (USB) ports, IEEE-1394 ports, and AC and/or DC power connectors. Further, the device 10 may use the input and output ports to connect to and send or receive data with any other device, such as other portable electronic devices, personal computers, printers, or the like. For example, in one embodiment, the device 10 may connect to a personal computer via an IEEE-1394 connection to send and receive data files, such as media files.
Additional details of the illustrative device 10 may be better understood through reference to
As discussed further herein, the user interface 20 may be displayed on the display 14, and may provide a means for a user to interact with the electronic device 10. The user interface may be a textual user interface, a graphical user interface (GUI), or any combination thereof, and may include various layers, windows, screens, templates, elements, or other components that may be displayed in all or in part of the display 14. The user interface 20 may, in certain embodiments, allow a user to interface with displayed interface elements via one or more user input structures 16 and/or via a touch sensitive implementation of the display 14. In such embodiments, the user interface provides interactive functionality, allowing a user to select, by touch screen or other input structure, from among options displayed on the display 14. Thus the user can operate the device 10 by appropriate interaction with the user interface 20.
The processor(s) 22 may provide the processing capability required to execute the operating system, programs, user interface 20, and any other functions of the device 10. The processor(s) 22 may include one or more microprocessors, such as one or more “general-purpose” microprocessors, one or more special-purpose microprocessors and/or ASICS, or some combination thereof. For example, the processor 22 may include one or more reduced instruction set (RISC) processors, such as a RISC processor manufactured by Samsung, as well as graphics processors, video processors, and/or related chip sets.
As noted above, embodiments of the electronic device 10 may also include a memory 24. The memory 24 may include a volatile memory, such as random access memory (RAM), and/or a non-volatile memory, such as read-only memory (ROM). The memory 24 may store a variety of information and may be used for various purposes. For example, the memory 24 may store the firmware for the device 10, such as an operating system, other programs that enable various functions of the device 10, user interface functions, processor functions, and may be used for buffering or caching during operation of the device 10.
The non-volatile storage 26 of device 10 of the presently illustrated embodiment may include ROM, flash memory, a hard drive, or any other suitable optical, magnetic, or solid-state storage medium, or a combination thereof. The storage 26 may store data files such as media (e.g., music and video files), software (e.g., for implementing functions on device 10), preference information (e.g., media playback preferences), lifestyle information (e.g., food preferences), exercise information (e.g., information obtained by exercise monitoring equipment), transaction information (e.g., information such as credit card information), wireless connection information (e.g., information that may enable the device 10 to establish a wireless connection, such as a telephone connection), subscription information (e.g., information that maintains a record of podcasts, television shows, or other media to which a user subscribes), telephone information (e.g., telephone numbers), and any other suitable data.
The embodiment illustrated in
The exemplary device 10 depicted in
Further, the device 10 may also include a power source 32. In one embodiment, the power source 32 may be one or more batteries, such as a Li-Ion battery, may be user-removable or secured to the housing 12, and may or may not be rechargeable. Additionally, the power source 32 may include AC power, such as provided by an electrical outlet, and the device 10 may be connected to the power source 32 via the I/O ports 18.
In this embodiment, execution of the rendering function results in the movement of the graphical object 62 from point a to point b along a path 68 over the course of four seconds. Thus, counting in one second intervals (e.g., times t1, t2, t3, and t4), the graphical object 62 may be located at coordinates (2,1), (4,2), (6,3), and (8,4) at respective times t1, t2, t3, and t4.
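The per-second coordinates above follow from simple linear interpolation between the endpoints. As a minimal sketch (assuming point a is the origin (0,0) and point b is (8,4), consistent with the sampled coordinates; the function name is illustrative):

```python
def interpolate(a, b, t, duration):
    """Linearly interpolate between points a and b as t runs from 0 to duration."""
    frac = t / duration
    return (a[0] + (b[0] - a[0]) * frac,
            a[1] + (b[1] - a[1]) * frac)

# Sampling the path from point a = (0, 0) to point b = (8, 4) at one-second
# intervals over the four-second animation:
for t in (1, 2, 3, 4):
    print(interpolate((0, 0), (8, 4), t, 4.0))
# -> (2.0, 1.0), (4.0, 2.0), (6.0, 3.0), (8.0, 4.0)
```

Each sampled position matches the coordinates recited for times t1 through t4.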
In another embodiment, the evaluators may be used to render a more complex animation effect, as generally illustrated in the exemplary graph 70 of
R(t)=D(A,B), over t=0 to t=4,
may result in the animation of the graphical object 62 along a path 78, as also generally illustrated in
Further, additional evaluators may be defined and utilized by the rendering function to adjust other display characteristics of the graphical object 62. For example, an evaluator “C” may be defined to control the opacity of the graphical object 62 over a period of time. For instance, the evaluator C may include the function:
f(t)=(4−t)/4, from t=0 to t=4
such that the evaluator returns an opacity value between 0 and 1, inclusive, where an opacity value of 1 corresponds to fully opaque and the opacity value 0 corresponds to fully transparent (i.e., not visible). The rendering function for the graphical object 62 may then be defined as:
R(t)=[D(A,B)],C; from t=0 to t=4,
resulting in the gradual fading of the graphical object 62 as it travels along the path 78, as generally illustrated in
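The composite rendering function R(t) = [D(A,B)], C can be sketched as two evaluators sampled together at each time step (a minimal illustration; the evaluator and function names mirror the notation above but the implementation details are assumed):

```python
def position_evaluator(a, b, t, duration=4.0):
    # Evaluator D(A, B): the object's coordinates at time t.
    frac = t / duration
    return (a[0] + (b[0] - a[0]) * frac,
            a[1] + (b[1] - a[1]) * frac)

def opacity_evaluator(t, duration=4.0):
    # Evaluator C: f(t) = (4 - t) / 4, fading from fully opaque (1)
    # to fully transparent (0).
    return (duration - t) / duration

def render(a, b, t):
    # R(t) = [D(A, B)], C from t = 0 to t = 4: each frame yields both
    # a position and an opacity for the graphical object.
    return position_evaluator(a, b, t), opacity_evaluator(t)
```

At the midpoint t = 2, for example, the object sits halfway along its path at half opacity; at t = 4 it reaches its destination fully transparent.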
An exemplary method 90 for animating a series of graphical objects or images is depicted in
In one embodiment, the exemplary method 90 includes receiving a user input, such as via a user input structure 16, and calculating a virtual destination for one or more displayed graphical objects based on the user input, as generally indicated in blocks 96 and 98 of
Once the virtual destination is calculated for a graphical object, the graphical object may be animated toward its virtual destination over any desired time period, as generally indicated in block 100. If an additional user input is received during animation of the graphical object toward its virtual destination (block 102), a new, updated virtual destination for the graphical object may be calculated based on the new user input and the graphical object may proceed from its present location to its new virtual destination over a new time period. In one embodiment, the receipt of a new user input during an animation sequence initiated by a previous user input results in the cancellation of the original animation sequence and the beginning of a new animation sequence. Such cancellation may be performed such that the graphical object is essentially “handed-off” from the first animation sequence to the second animation sequence without any, or any significant, perceptible break in the animation of the graphical object from its original position to its updated virtual destination.
In some embodiments, the time periods associated with the animation sequences in response to original and new user inputs may be substantially equivalent, although differing time periods may be used in full accordance with the present techniques. The moving or animating of the graphical object may be performed according to a function including an evaluator, such as generally described above with respect to
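The cancellation and "hand-off" behavior described above can be sketched as an animation object that, on each new input, restarts its timer from the object's present interpolated location rather than its original starting point (class and method names here are illustrative, not from the disclosure):

```python
class Animation:
    """Sketch of the hand-off behavior: a new user input cancels the current
    animation sequence and begins a new one from wherever the object currently
    is, so no perceptible break occurs in the object's motion."""

    def __init__(self, start, duration=1.0):
        self.start_pos = start
        self.dest = start
        self.t0 = 0.0
        self.duration = duration

    def retarget(self, dest, now):
        # Cancel the original sequence: capture the object's present
        # location as the new starting point, then restart the clock.
        self.start_pos = self.position(now)
        self.dest = dest
        self.t0 = now

    def position(self, now):
        # Interpolate toward the current virtual destination, clamping
        # at the destination once the time period has elapsed.
        frac = min((now - self.t0) / self.duration, 1.0)
        return tuple(s + (d - s) * frac
                     for s, d in zip(self.start_pos, self.dest))
```

For example, retargeting halfway through a one-second sequence begins the second sequence from the midpoint of the first, with the same (substantially equivalent) one-second duration.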
An exemplary viewport 110, in which graphical objects may be displayed, is provided in
The animation of a particular graphical or image object 120 of the plurality of graphical objects 112 may be better understood with reference to the diagrams illustrated in
Once a user input is received, a virtual destination 130, at a distance x1 from a starting location 132 (at P22) of the graphical object 120 may be calculated. The graphical object 120 may then be moved or animated from the location 132 to the virtual destination 130 over a given period of time, such as one second, three seconds, or some other amount of time. At the conclusion of such animation, the sequence of graphical objects 112 may be located within the positions 126 as illustrated in diagram 136 of
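The disclosure does not specify how the distance x1 is derived from the user input; the sketch below simply takes x1 as given and tests whether the resulting destination is a "virtual" position, i.e., one whose coordinates fall outside the viewport's display area (all names and the viewport geometry are illustrative assumptions):

```python
def virtual_destination(start, x1):
    # Offset the starting location by the calculated distance x1 along
    # the scrolling axis; the result may lie outside the viewport.
    return (start[0] + x1, start[1])

def is_virtual(pos, viewport_width, viewport_height):
    # A virtual display position has coordinates corresponding to a
    # location outside of the display area.
    x, y = pos
    return not (0 <= x < viewport_width and 0 <= y < viewport_height)
```

For instance, with a 320-pixel-wide viewport, a destination offset 300 pixels from a starting location at x = 100 lands off screen, so the object is animated toward a position it will occupy only virtually.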
In an additional embodiment, an additional user input may be received during animation of the graphical object 120 from its starting location 132 to its virtual destination 130, as generally illustrated in diagram 142 of
Additionally, although the examples provided above include the movement of the graphical object 120 (in accordance with a calculated virtual destination) from an actual display position within the viewport 110 to a virtual display position outside of the viewport 110 (
While the invention may be susceptible to various modifications and alternative forms, specific embodiments have been shown by way of example in the drawings and have been described in detail herein. However, it should be understood that the invention is not intended to be limited to the particular forms disclosed. Rather, the invention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the invention as defined by the following appended claims.
Inventors: Anne Jones; David Heller; Thomas Dowdy
Assignee: Apple Inc. (assignment on the face of the patent, executed May 09 2013)