Compiled code is stored on a mobile device or embedded in an application on the device. Instructions are assembled in a compiled program and stored on a system for subsequent transmission to the mobile device over a network. An application on the mobile device interfaces with the network to request and receive instructions. The instructions are retrieved from the data store and returned over the connection to the mobile device. After receipt by the mobile device, the instructions determine which code is to be activated. New instructions are downloaded to the device and the new instructions activate different code.
27. A system comprising:
a processor;
a network interface for communicating with a network;
an operating system;
a memory storage for the storing of first instructions and second instructions that are not comprised of compiled code for subsequent transmission to a compiled application executing on a mobile device having a display and having a platform for implementing compiled applications on said mobile device;
compiled executable code comprising a plurality of behaviors in portions of code for instantiation into objects, the compiled executable code being associated on the platform with the compiled application and stored in memory storage on the mobile device, the compiled application having access to the platform to implement application components separately from the first instructions and second instructions and the compiled executable code;
wherein the first instructions upon receipt by said compiled application trigger a first instantiation of at least one of said portions of code into a first object for execution by said compiled application, the second instructions upon receipt by said compiled application trigger a second instantiation of at least one of said portions of code into a second object for execution by said compiled application, the second object comprising at least one different said portions of code from the first object;
wherein more than one compiled application is executable on the platform by said operating system and on platforms of a different type than said platform, and capable of using the access to the platform of a different type to implement application components separately and using the first and second instructions for triggering and executing portions of code; and
wherein said first instructions comprise a uniform resource locator (URL) associated with an image for display on the display, and with respect to the second instructions, said triggering of the portions of code comprises instantiating an application component object for display.
19. A method for execution on a device including a processor, said device comprising at least a network interface for communicating with a network, an operating system, a memory storage, a display, a platform for implementing compiled applications on said device, a compiled application and compiled executable code comprising a plurality of behaviors in portions of code for instantiation into objects, said compiled executable code being associated on the platform with said compiled application and stored in said memory storage, the method comprising:
executing the compiled application on the platform by said operating system;
receiving, by said compiled application from said network interface first instructions and second instructions that are not comprised of compiled code, the compiled application having access to the platform to implement application components separately from the first instructions and second instructions and the compiled executable code;
instantiating by said compiled application, triggered upon said receipt of the first instructions, at least one of said portions of code into a first object for execution by said compiled application;
instantiating by said compiled application, triggered upon said receipt of the second instructions, at least one of said portions of code into a second object for execution by said compiled application, the second object comprising at least one different said portions of code from the first object; and
wherein more than one compiled application is executable on the platform by said operating system and on platforms of a different type than said platform, and capable of using the access to the platform of a different type to implement application components separately and using the first and second instructions for instantiating and executing portions of code; and
wherein said first instructions comprise a uniform resource locator (URL) associated with an image for display on the display, and with respect to the second instructions, said instantiating of the portions of code comprises instantiating an application component object for display.
1. A device including a processor, said device comprising:
a network interface for communicating with a network;
an operating system;
a memory storage;
a display;
a platform for implementing compiled applications on said device;
a compiled application that is executable on the platform by said operating system, the compiled application being capable of receiving first instructions and second instructions from the network interface;
compiled executable code comprising a plurality of behaviors in portions of code for instantiation into objects, said compiled executable code being associated on the platform with the compiled application and stored in the memory storage;
wherein the compiled application has access to the platform to implement application components separately from the first instructions and second instructions and the compiled executable code, and the compiled application receives first instructions and second instructions from the network interface that are not comprised of compiled code;
wherein the first instructions upon receipt by said compiled application trigger a first instantiation of at least one of said portions of code into a first object for execution by said compiled application, the second instructions upon receipt by said compiled application trigger a second instantiation of at least one of said portions of code into a second object for execution by said compiled application, the second object comprising at least one different said portions of code from the first object;
wherein more than one compiled application is executable on the platform by said operating system and on platforms of a different type than said platform, and capable of using the access to a platform of a different type to implement application components separately and using the first and second instructions for triggering and executing portions of code; and
wherein said first instructions comprise a uniform resource locator (URL) associated with an image for display on the display, and with respect to the second instructions, said triggering of the portions of code comprises instantiating an application component object for display.
30. A device including a processor, said device comprising:
a network interface for communicating with a network;
an operating system;
a memory storage;
a display;
a platform for implementing compiled applications on said device;
a compiled application that is executable as a background thread on the device independent from other applications on the platform by said operating system, the compiled application being capable of receiving first instructions and second instructions from the network interface;
compiled executable code comprising a plurality of behaviors in portions of code for instantiation into objects, the compiled executable code being associated on the platform with the compiled application and stored in the memory storage;
wherein the compiled application has access to the platform to implement application components separately from the first instructions and second instructions and the compiled executable code, and the compiled application receives first instructions and second instructions from the network interface that are not comprised of compiled code;
wherein the first instructions upon receipt by said compiled application trigger a first instantiation of at least one of said portions of code into a first object for execution by said compiled application, the second instructions upon receipt by said compiled application trigger a second instantiation of at least one of said portions of code into a second object for execution by said compiled application, the second object comprising at least one different said portions of code from the first object;
wherein more than one compiled application is executable on the platform by said operating system and on platforms of a different type than said platform, and capable of using the access to a platform of a different type to implement application components separately and using the first and second instructions for triggering and executing portions of code; and
wherein said first instructions comprise a uniform resource locator (URL) associated with an image for display on the display, and with respect to the second instructions, said triggering of the portions of code comprises instantiating an application component object for display.
2. The device of
3. The device of
5. The device of
6. The device of
7. The device of
8. The device of
9. The device of
10. The device of
a user interface for receiving user input;
wherein a portion of said first instructions are not for display on said display.
11. The device of
12. The device of
13. The device of
14. The device of
15. The device of
16. The device of
17. The device of
18. The device of
20. The method of
21. The method of
22. The method of
a user interface for receiving user input;
a display;
wherein a portion of said first instructions are not for display on said display.
23. The method of
retrieving an image from a URL in said first instructions;
displaying said image on said display; and
changing the displayed location of said image on said display based on input received from said user interface.
24. The method of
retrieving a text file from a URL in said first instructions;
displaying said text file on said display; and
changing the displayed location of said text file on said display based on input received from said user interface.
25. The method of
26. The method of
28. The system of
29. The system of
This application is a continuation-in-part of U.S. application Ser. No. 11/949,037, filed on Dec. 2, 2007, now U.S. Pat. No. 8,271,884 which claims the benefit of provisional U.S. Application No. 60/872,898, filed on Dec. 5, 2006, the disclosures of which are incorporated herein in their entirety.
This disclosure relates to the efficient and secure delivery of not compiled code instructions over a network to previously downloaded and running compiled code in the operating system, an application or a browser plug-in on a device to exhibit and change appearance, functionality and behavior, with application to animation, video and 3D players.
An increasing number of mobile devices are being offered on the market with various operating systems (OS), typically featuring a software developer kit for programming, compiling, and downloading applications to run on the device. Examples of major Software Developer Kits (SDKs) presently available include:
All of these SDKs have been used by developers to program and compile applications, which are downloaded over the air by the user and stored locally on a mobile networked device for subsequent execution. Once downloaded, however, the application's compiled code is limited by the sandbox security model as to the content it can download to the device.
On desktop computer machines, applications can load new classes from external sources while the application is running to alter functionality or content presented to the user. This capability has been misused by developers and has resulted in a vast number of unwanted destructive viruses and adware being installed on desktop computers. Care was taken in writing standards for mobile networked devices to provide more security with what is called the “sandbox” model to prevent this from occurring on these devices. Mobile network application security is important to everyone involved in the industry; the security on mobile networked devices is not likely to be loosened by carriers and software standards for at least the following reasons:
The sandbox security model was developed to prevent downloading disruptive or destructive software (compiled code) to mobile devices. The sandbox security model on mobile networked devices limits applications to only the compiled code that was originally downloaded and installed by the user, and prevents the downloading of additional compiled code from an external source.
While this security model prevents the devious attacks mentioned above, it also prevents the download of additional code to make new functionality available to applications running on the devices. On most mobile platforms, applications are only allowed to download image, byte data, text files and video. Due to the sandbox, games, content and advertising are thus tethered to the code initially downloaded by the user. A game or ad can change the images presented by downloading new image files, but it cannot change the behavior of the game or ad while running. Playing a different game, or displaying another ad or animation exhibiting different behavior, requires the download and installation of new compiled code on the mobile networked device, thus limiting the extent to which authors of such content can alter its behavior while running on the mobile networked device.
However, some vendors have left security holes open for exploitation. For example, the Android OS allows the download of compiled code by installed applications, but this is not considered a good practice and it is not a trusted, portable, well-performing method. The powerful permissions required, which must be granted by the user, open access by other applications on the device to download and execute malicious code in the name of the application signer. Indeed, present Android devices face threats from downloaded applications, and Android is tightening security. Other vendors, such as Apple, closely monitor applications for such security risks before allowing the application to be downloaded, thus ensuring compliance with the sandbox security model. Moreover, the file size of the compiled code is often so large that it introduces performance issues which make it ill-suited to exhibiting and changing appearance, behavior and functionality on devices.
The downloading of compiled code by an application on a mobile device is presently considered bad practice and is largely prohibited. The security restrictions in place on mobile devices, which have kept them relatively free of malware, will remain in place; one skilled in the art would understand the risks and not download compiled code in mobile applications.
Accordingly, there presently is a need for an efficient method to deliver and display a plurality of graphical presentations and/or advertising and games to mobile networked devices without having to reprogram said mobile networked devices to display each distinct graphical presentation.
Furthermore, there is a need for an efficient and secure method of downloading not compiled code instructions, within the security sandbox, to trigger capabilities compiled into the previously downloaded code to exhibit and change appearance, behavior and functionality on devices, and for applying that method to a more efficient video player for use with animations and to a 3D player on all devices.
Sprite—A term that has become accepted in computer gaming to refer to a protagonist in a game. Sprites are represented with images and movement that change according to code in the compiled application; sprite types include, but are not limited to: Random, Vertical, Lateral, Lateral and Vertical, Projectile, Rotating Text and Image, Video and Rotating Banner.
Graphical animation capabilities—A set of code which executes an aspect of graphical display and/or movement logic on the mobile networked device, such as image display; vertical, lateral, both vertical and lateral, video and 3D, or random movement of a graphical image; or removal of an image upon collision with another image.
Presentation—A term used in the field of the invention to refer to the graphical rendering and movements produced on the screen of the mobile networked device which is produced by instructions triggering the graphical animation capabilities.
Collection of Presentations—A term used in the field of the invention to refer to instructions ordered to constitute a collection of instructions as a series of Presentations delivered to the mobile networked device.
Perpetrator—A term used to define a Sprite which causes other Sprites to be removed from the Presentation upon collision with it.
Server—A machine on a network which runs compiled code of the invention, accepts connections and can send content and instructions to a mobile networked device or an application on a mobile networked device.
Instructions—The delimited integers, characters and bitmasks that trigger the graphical animation capabilities. For a Random Sprite the instructions may include, but not be limited to, the URL address for the image to display, frequency, location and dimensions used to create the Random Sprite, the maximum number of Random Sprites to create and a protagonist that may remove the Random Sprite upon collision.
Mobile networked device—A mobile networked device includes a processor, storage, and possibly a display screen. The device is either physically connected to a network or connected to a network over the air using some type of mobile network technology, such as, but not limited to, wireless cellular communication. Such mobility may be accomplished by a person carrying the device or by the device being installed in some other component or larger mobile networked device.
Entity—An organization or business with members or customers who would view advertising.
The inventor recognized that the security restrictions enforced on mobile networked devices would severely hinder the variety of graphical Presentations, games, advertising and other graphical content that could be efficiently and securely presented on mobile networked devices. The extent of change an application could effect would be limited to just changing the images and text on the screen or loading new videos or sounds to play. Anything to do with changing the movement and behavior of the downloaded content would involve the download of a new compiled application. The previously discussed programming SDKs as delivered by the respective vendors do not provide any code or methods which could be used, as is, to effect a change in animated movement or game logic from what was originally compiled as an application by the developer using the language and then subsequently downloaded by the mobile networked device. Some offer flexibility to download new text, images and/or video files, but no functionality exists to enable a complete change in animation, game logic or overall appearance of the Presentation as compiled and installed on the mobile networked device. Thus, mobile application advertising is limited to banner ads without animation.
The inventor further recognized that all graphical animation capabilities on mobile networked devices could be abstracted, compiled and then downloaded and loaded onto the mobile networked device, providing the capability to present all possible movements and behaviors that could take place on the screen. Once loaded, the graphical animation capabilities could be triggered by instructions from a web server to the application code running on the device to present a particular screen action, such as lateral and/or vertical movement, random creation of Sprites, projectiles, collisions and other graphic content used in games and/or ads.
The abstraction of the basic graphical capabilities available in programming languages into graphical animation capabilities provides the ability to present various advertising, games and animation on the screen without downloading new compiled code to the mobile networked device. Desired behavioral characteristics could be triggered in an application running on the mobile networked device by not compiled instructions from a web application server, thus operating within the sandbox designed and enforced by network providers and making more efficient use of limited mobile network bandwidth.
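For illustration only, the following minimal sketch (in Java, with a hypothetical class name, wire format and parameter names that are not taken from the disclosure) shows how delimited, not compiled instructions might be parsed and dispatched to capability code that was compiled into the application before download; only data crosses the network, never executable code.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: delimited instruction strings select and configure
// capabilities that were compiled into the application before download.
public class InstructionDispatcher {

    // Capabilities are compiled in ahead of time; instructions only select them.
    interface Capability {
        void trigger(Map<String, String> params);
    }

    private final Map<String, Capability> capabilities = new HashMap<>();

    public void register(String name, Capability capability) {
        capabilities.put(name, capability);
    }

    // Assumed wire format: "RANDOM_SPRITE;url=http://...;freq=500;max=10"
    public void dispatch(String instruction) {
        String[] parts = instruction.split(";");
        Capability capability = capabilities.get(parts[0]);
        if (capability == null) {
            return; // unknown instruction: ignored, nothing is downloaded or executed
        }
        Map<String, String> params = new HashMap<>();
        for (int i = 1; i < parts.length; i++) {
            String[] kv = parts[i].split("=", 2);
            if (kv.length == 2) {
                params.put(kv[0], kv[1]);
            }
        }
        capability.trigger(params);
    }

    public static void main(String[] args) {
        InstructionDispatcher dispatcher = new InstructionDispatcher();
        dispatcher.register("RANDOM_SPRITE",
                params -> System.out.println("Create random sprite from " + params.get("url")));
        dispatcher.dispatch("RANDOM_SPRITE;url=http://example.com/ball.png;freq=500;max=10");
    }
}
```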
Animations presented by the invention on mobile networked device screens may be part of an application or independent of a particular application. The invention code would be self-contained and could be a standalone application or embedded in another application, a browser plug-in or the operating system of the mobile networked device. The invention would operate as a service for the Presentation of games and advertising, or of a game that imparts advertising material, on a mobile networked device. This service, disclosed in the parent application, is referred to as Graphical Animation Advertising and Informational Content Service for Handheld Devices, or GADS.
In summary, a method embodying the present disclosure overcomes the limitations of existing technology by providing a more efficient method for the delivery of a plurality of graphical Presentations and/or advertising and games on networked devices. While existing technology requires the download of compiled code, HTML or scripting languages to effect logic and behavioral changes to Presentations and games, the present disclosure requires only new not compiled code instructions to accomplish the same changes. When used as a service by entities, hours of programming time are saved in producing ads, and users are spared the inconvenience of massive downloads over limited bandwidth for advertising and games.
Implementations may include one or more of the following features. For example, instructions for one or more Presentations and informational content as a collection of Presentations may be downloaded to the mobile networked device over a mobile network interface. The instructions are then used to download graphical or text elements to the device and present the graphical or text elements on the mobile networked device with movement and animation that may impart an advertising message, a game or other informational content.
According to an embodiment of the disclosure, a set of code defines all aspects of graphical movement in graphical animation capabilities and logic on the mobile networked device, which may include vertical, lateral, both vertical and lateral, random, or removal of images upon collision of images. The combination and instantiation of instructions is ordered to trigger capability in compiled binary code previously loaded and running on said device to exhibit and change appearance, functionality and behavior.
Once all instructions for all Presentations in the collection have been loaded into an array, a background process is started on the mobile networked device in accordance with the invention. The code then runs in the background and loops through the objects, creating each Presentation by executing the code for the graphical animation capabilities. The application then presents the graphics on the device screen and loops through the instantiated types, moving them according to the instructions provided. In this embodiment, the instructions are used to provide a graphical Presentation on the screen of the mobile networked device.
The instructions generally include a time limit. After the Presentation is displayed on the device and the time limit has expired, the instructions for the next Presentation are used to create a new graphical Presentation. Graphical or text elements may be downloaded prior to the expiration of the previous Presentation for inclusion in the next Presentation. With the instructions and graphical or text elements previously downloaded to the mobile networked device from the web application server, a new Presentation is presented on the mobile networked device that is completely different from the previous Presentation.
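A minimal sketch of such a control loop follows, assuming hypothetical class names and a per-Presentation time limit carried in the instructions; it is not the disclosed implementation, only an illustration of cycling through a pre-downloaded collection.

```java
import java.util.Arrays;
import java.util.List;

// Hypothetical sketch of a background loop that shows each Presentation for
// its instructed time limit, then moves on to the next one in the collection.
public class PresentationLoop implements Runnable {

    static class Presentation {
        final String name;
        final long timeLimitMillis; // taken from the not compiled instructions

        Presentation(String name, long timeLimitMillis) {
            this.name = name;
            this.timeLimitMillis = timeLimitMillis;
        }

        void show()  { System.out.println("Showing " + name); }
        void clear() { System.out.println("Clearing " + name); }
    }

    private final List<Presentation> collection;

    PresentationLoop(List<Presentation> collection) {
        this.collection = collection;
    }

    @Override
    public void run() {
        for (Presentation presentation : collection) {
            presentation.show();
            try {
                Thread.sleep(presentation.timeLimitMillis); // wait out the time limit
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                return;
            }
            presentation.clear(); // remove objects and reclaim memory before the next one
        }
    }

    public static void main(String[] args) {
        new Thread(new PresentationLoop(Arrays.asList(
                new Presentation("Ad A", 1000),
                new Presentation("Ad B", 1000)))).start();
    }
}
```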
According to an embodiment of the disclosure, a mobile networked device may be programmed to operate in accordance with a set of code, based upon a programming language, to implement graphical animation capabilities to be triggered by instructions to present graphical animation and informational content on a mobile networked device.
According to another embodiment of the disclosure, a web application server may be programmed to send code to a browser to enable assembly of not compiled code instructions and to send assembled instructions to a server to store the instructions.
According to another embodiment of the disclosure, storage is provided in a server for the not compiled code instructions returned to the web application server from the browser code. The instructions are stored in the server in an order of collections of Presentations.
According to another embodiment of the disclosure, a mobile networked device may be programmed to initiate a network communication to a remote web application server to download the Presentation instructions and URL locations of graphical or text elements to be used to create a collection of Presentations on the mobile networked device, which may be based on geographical locations determined by GPS on said device.
According to another embodiment of the disclosure, another network communication is initiated to a remote web application server, after a specified period of time, to download more instructions and URL locations of graphical or text elements to create a new collection of Presentations on the mobile networked device.
The foregoing aspects and other features of the present invention are explained in the following description, taken in connection with the accompanying drawings, wherein:
Techniques will be described for sending instructions and graphical or text elements to an embodiment of the disclosure running as an application, OS or browser plug-in on a mobile networked device, to trigger graphical animation capabilities to display or otherwise present collections of graphical Presentations to a user of a mobile networked device, which may also include an audio or video component.
Mobile Browsers and Mobile Applications
It will be appreciated that mobile browsers are distinct from mobile applications: An Internet Browser, hereinafter referred to as a browser, is an application that, once installed and running on the mobile device, downloads HTML code which requires an interpreter to compile the code into machine language to display and change content on the screen.
However, if a change in the browser application itself is required, a new version must be programmed, compiled and downloaded to the device.
In contrast, the not compiled code instructions described herein do not require a compiler or browser interpreter, but rather trigger previously downloaded compiled code on the mobile device to exhibit and change appearance, functionality and behavior of the application on the mobile device.
Video/Audio Player
In a GADS service embodying the present disclosure, Video/Audio player capability is presented as a Video Sprite which plays a video in a frame within the ad window along with the animation. Accordingly, the GADS service supports the playing of video as part of the presentation without movement, or with movement of the video frame along the x and y axes, so that the Video Sprite appears as a participant in the overall animation.
Embodiments as disclosed in the parent application may include video sprites as participants in the animations. The inventor found, however, that the video players offered in the various platform SDKs are not efficient enough to be included as part of animations: the players often experience annoying delays, are slow to restart at a previous position, may require user interaction with buttons, and do not integrate well with animations presented by the parent application invention. In sum, these platform video players are better suited for large videos that fill the entire screen, where users tolerate the delays, than as a quick-loading component for animations. Thus the inventor was motivated to extend the efficient and secure method of the invention to exhibit and change video with video frame movement within the overall animation.
To accomplish this, capability to display frame images from a larger packed image file, with a set image frame width and height, at a frequency that results in the appearance of a video on the device screen, is compiled into code for a device including a processor, screen and ability to connect to a network. The desired sequence of frame images is then extracted from a larger video file and packed into an image file. An accompanying audio file may be included that is sequenced by image frames for playback with the video. The capability compiled code is then loaded onto the device and run.
Not compiled code instructions are then assembled, which specify the width and height of each individual image frame packed in the larger image file, along with the frequency at which to display the frames. When these instructions are passed to the compiled code on the device along with the packed image file, the compiled code capability is triggered by the instructions to calculate the total image size and the number of frames, which are used along with the frequency to show the images in sequence like a video and possibly play the accompanying audio for each frame. An example of a video frame image sheet is shown in
The video player may be configured to respond to direction from the user interface to reposition the image frame to display either forward or backward within the larger image. This enables the user to reposition the video position for playback or stop the video play at a certain image frame.
By including in the instructions a replay frame number to restart the video when the capability code reaches the end of the image frames, the video will keep playing from that specified replay position to the end, or even in reverse order if specified. In the video image sheet example above, the replay position is frame 3. With the subject of the last frame and of the replay frame in the same position, this results in a smooth replay motion. Furthermore, while the video is playing, the frame containing the video can participate in the overall animation of the presentation by specifying in the instructions the starting x and y axis position for the parent frame and movement parameters to sequence with the animation.
The use of the not compiled code instructions method to trigger the capability in the compiled code running on the device to exhibit and change the video, using the variable frame image size, frequency, restart position and movement specified, is unique. In the absence of not compiled code instructions of the invention specifying this variable information, each packed image file downloaded to the compiled code on the device would have to have the same individual frame size, start at the same position, replay at the same position, and use the same position and movement in the animation.
A GADS service embodying the present disclosure is generally able to play videos using either a platform player or a player as described herein. As noted above, video presentations will generally include movement of the video frame along the x and y axes within the presentation to appear as part of the animation. Presently available platform players provide such video at a much slower rate.
It is understood that video programs may generally include audio. In particular, audio may accompany an animated presentation.
Example: Video Player
In a specific embodiment, content is prepared for the video player by disassembling a real video format file into frames. In practice the video typically has a transparent background, accomplished by shooting it against a green screen. The frames that fit the ad theme are then selected, scaled for the presentation, and packed into a sprite sheet. The accompanying instructions of the invention trigger the downloading of the file and activation of the player, positioning the frame at the desired x and y axis point. Once downloaded, the frames are run through the player at a frame speed independent of the animation to show the video. There is no theoretical limit on the number of frames that can be packed into a sprite sheet, but for downloading over current networks, 30 frames (less than 300 k in total file size) may be considered a practical limit. Once the player reaches the end of the sprite sheet, it may start again at a frame designated by the ad creator: the first frame, or a frame that is synchronized with the movement of the last frame, which could be the first or a middle frame; or it may play the video in reverse. With each frame movement, the x and y position of the video frame can be adjusted to make the frame move within the overall animation. An audio portion could be added to match each frame and be played and adjusted along with the frames.
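The following sketch illustrates, under assumed names and a simplified single-row sprite sheet, how the frame count might be computed from the sheet width and the instructed frame width, and how the replay frame and optional reverse play described above could be handled; it is not the patented player itself.

```java
// Hypothetical sketch of the packed-image ("sprite sheet") video logic described
// above: the instructions carry only frame width, display frequency and a replay
// frame; the frame count is computed from the total sheet size.
public class SheetVideoPlayer {

    private final int frameWidth;       // from the instructions
    private final int frameCount;       // computed: sheetWidth / frameWidth
    private final int replayFrame;      // frame to restart from at the end
    private final boolean reverseOnEnd; // optionally play backwards after the end
    private int current = 0;
    private int step = 1;

    public SheetVideoPlayer(int sheetWidth, int frameWidth,
                            int replayFrame, boolean reverseOnEnd) {
        this.frameWidth = frameWidth;
        this.frameCount = sheetWidth / frameWidth;
        this.replayFrame = replayFrame;
        this.reverseOnEnd = reverseOnEnd;
    }

    // X offset into the packed sheet for the frame to draw this tick.
    public int currentFrameOffset() {
        return current * frameWidth;
    }

    // Called once per display tick at the instructed frequency.
    public void advance() {
        current += step;
        if (current >= frameCount) {
            if (reverseOnEnd) {
                step = -1;                 // run the frames backwards
                current = frameCount - 1;
            } else {
                current = replayFrame;     // smooth restart at the replay frame
            }
        } else if (current < 0) {
            step = 1;
            current = replayFrame;
        }
    }

    public static void main(String[] args) {
        // e.g. a 3000px-wide sheet of 100px frames = 30 frames, replaying from frame 3
        SheetVideoPlayer player = new SheetVideoPlayer(3000, 100, 3, false);
        for (int tick = 0; tick < 35; tick++) {
            System.out.println("draw frame at x=" + player.currentFrameOffset());
            player.advance();
        }
    }
}
```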
3D Player
In another embodiment, the not compiled code instructions of the invention can also be used to trigger capability programmed, compiled, loaded and run on a mobile device, including a processor, screen and ability to connect to a network, to display 3D image frame views packed into an image sheet organized in rows and columns with a set frame image width and height (see
As with the video player, the not compiled code instructions include the frame size of each 3D view packed into the image sheet, as in the example table above. The instructions trigger capability in the compiled code on the device to display the images in sequence or using direction from the user interface: Tilt Up=minus a row, Tilt Dn=plus a row; Left=minus one column; and Right=plus a column. For example, when at the center row and column Center 4 Center 4 and direction from the user interface indicates Tilt Up and to the Left, it displays the 3D image view frame Tilt Up 3 Side 3. Moreover, the frame size and total image size in the instructions are used by the capability in the compiled code to compute boundaries that prevent a call for the display of a nonexistent frame outside of the total number of rows and columns or beyond zero, and either stop the Tilt Up, Tilt Dn or Left, Right movement, or roll over to the opposite view (i.e., Tilt Dn 8 Side 8 becomes Tilt Up 0 Side 0).
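A hedged sketch of that navigation logic, with hypothetical class and parameter names: the row and column counts are computed from the sheet and frame sizes carried in the instructions, and out-of-range moves either stop or roll over depending on an assumed flag.

```java
// Hypothetical sketch of the 3D-view navigation described above: views are packed
// into a sheet of rows (tilt) and columns (side); tilt/side input moves one row or
// column at a time, and out-of-range moves either stop or roll over to the
// opposite view, depending on the instructions.
public class SheetViewer3D {

    private final int rows, cols;     // computed from sheet size and frame size
    private final boolean rollOver;   // from the instructions
    private int row, col;             // current view

    public SheetViewer3D(int sheetWidth, int sheetHeight,
                         int frameWidth, int frameHeight,
                         int startRow, int startCol, boolean rollOver) {
        this.rows = sheetHeight / frameHeight;
        this.cols = sheetWidth / frameWidth;
        this.rollOver = rollOver;
        this.row = startRow;
        this.col = startCol;
    }

    // deltaRow: -1 = tilt up, +1 = tilt down; deltaCol: -1 = left, +1 = right
    public void move(int deltaRow, int deltaCol) {
        row = wrapOrClamp(row + deltaRow, rows);
        col = wrapOrClamp(col + deltaCol, cols);
    }

    private int wrapOrClamp(int value, int limit) {
        if (value < 0)      return rollOver ? limit - 1 : 0;
        if (value >= limit) return rollOver ? 0 : limit - 1;
        return value;
    }

    public String currentView() {
        return "row " + row + ", column " + col;
    }

    public static void main(String[] args) {
        // e.g. a 9x9 grid of 100x100 views starting at the center
        SheetViewer3D viewer = new SheetViewer3D(900, 900, 100, 100, 4, 4, true);
        viewer.move(-1, -1); // tilt up and to the left
        System.out.println(viewer.currentView()); // row 3, column 3
    }
}
```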
Note that in packing the image file sheet, the mirror effect needs to be taken into account: Side 0 is actually the rightmost visual side and Side 8 is the leftmost. The example above does not include back side views, but that is certainly possible with a much larger packed image file.
The instructions of the invention trigger the previously downloaded capability in the compiled code to exhibit and change 3D presentations. Using the width and height of each individual 3D view and the initial frame to display, both specified in the not compiled code instructions, the capability compiled code on the device is triggered to calculate the number of rows and columns. This enables the compiled code previously downloaded to the mobile device to display any configuration of 3D image sheet, each with a different start view.
In the absence of the not compiled code instructions of the invention to trigger the capability in the compiled code previously downloaded to the mobile device, each 3D sheet would have to have the same 3D view frame size and the same start view. But, using the instructions of the invention, which are downloaded to the compiled code running on the device along with the image sheet, the capability in the compiled code on the device is triggered to exhibit and change the appearance, functionality and behavior of the 3D views on the device.
Bandwidth Governor
The device may also be configured with a bandwidth governor that downloads ads that fit the available bandwidth at the user's location. For example, advertisers may have three versions of an ad: below 3G with low bars, 3G with high bars, and 4G and above or Wi-Fi.
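As a rough illustration only (the connection classes and version names are assumptions, not part of the disclosure), such a governor might map a coarse connection class to one of the prepared ad versions:

```java
// Hypothetical sketch of a bandwidth governor: the device reports a coarse
// connection class and the matching ad version is requested.
public class BandwidthGovernor {

    enum Connection { BELOW_3G, FAST_3G, FOUR_G_OR_WIFI }

    // Advertisers prepare one version of the ad per connection class.
    public static String adVersionFor(Connection connection) {
        switch (connection) {
            case BELOW_3G:  return "ad-small";   // smallest images, no video sprite
            case FAST_3G:   return "ad-medium";
            default:        return "ad-full";    // full animation, video and 3D sprites
        }
    }

    public static void main(String[] args) {
        System.out.println(adVersionFor(Connection.FAST_3G)); // ad-medium
    }
}
```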
User Interaction
Various forms of user interaction are available on devices, such as touch, keyboard and voice, and other forms of user interaction will certainly be available in the future. In this continuation-in-part, the drawings and claims have been updated to incorporate user interaction capability, which may include, but is not limited to, keyboard, touch screen, voice and other future interaction.
The not compiled code instructions for each presentation specify the user interaction capability to be triggered in the compiled code by a listener that waits on interface events. For example, the instructions may specify that one presentation does not allow for the expansion of the frame, while the next one does.
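A minimal sketch of that gating, using an assumed bitmask of allowed interactions carried in the instructions and consulted by a compiled-in listener:

```java
// Hypothetical sketch: the instructions for each presentation carry a bitmask of
// allowed interactions, and the compiled listener consults it before acting.
public class InteractionListener {

    static final int ALLOW_EXPAND = 1;  // expand the ad frame
    static final int ALLOW_DRAG   = 2;  // drag sprites with touch input

    private final int allowedMask;      // taken from the not compiled instructions

    public InteractionListener(int allowedMask) {
        this.allowedMask = allowedMask;
    }

    public void onTouch(String gesture) {
        if ("expand".equals(gesture) && (allowedMask & ALLOW_EXPAND) != 0) {
            System.out.println("Expanding the presentation frame");
        } else if ("drag".equals(gesture) && (allowedMask & ALLOW_DRAG) != 0) {
            System.out.println("Dragging a sprite");
        }
        // gestures not enabled by the instructions are ignored
    }

    public static void main(String[] args) {
        new InteractionListener(ALLOW_DRAG).onTouch("expand"); // ignored
        new InteractionListener(ALLOW_DRAG).onTouch("drag");   // handled
    }
}
```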
A user interface in an embodiment of the disclosure may include one or more of the following features:
Taken in concert with features previously disclosed in the parent application, the Video and 3D players become a different type of animation to display; thus, one animation may include the Video and 3D players while others do not.
Application to Robotics
In alternative embodiments, instructions as described above may direct robots in different tasks. Present-day robots are largely single-purpose, with each one being programmed and physically designed to perform a specific task or function. These robots perform different tasks either by being reprogrammed entirely, or by passing parameters to a control program for the robot.
For robots constructed for multiple purposes and carrying out different roles, all the logic for each role could be programmed and loaded in the robot as described herein with regard to appearance, functionality and behavior, comprising: animation, color, physical configuration, logic, interface, user interface, and artificial intelligence. The invention could then be used to deliver instructions that trigger the logic previously downloaded to the robot for the appearance, functionality and behavior of the role it should perform.
Detailed Description: Drawings
Specific processes embodying the disclosure are schematically illustrated in the drawings, as detailed below.
The invention repeats steps 1034 through 1058 until all the instructions sent by the web application server have been read and processed. The invention then terminates the background thread for loading instructions and creates a new background thread for running the Presentation collection and then follows the instructions sent by the web application server to begin the first Presentation (steps 1060 through 1064).
The instructions for a Banner Sprite must include the URL and text to be displayed (step 350). The instructions may also include text for the Sprite (step 350). A Banner Text Sprite may also move either laterally or vertically or both (steps 352 through 362). The image and/or the text on the Banner Text Sprite can change during the Presentation (steps 364 through 374). If the Sprite has a perpetrator defined, it is read from the instructions (steps 376 and 380). After the instructions for the Banner Text Sprite have been read, it is saved in the Presentation (step 382).
The invention then loops through the Presentation types defined in the instructions and instantiates each Presentation Sprite, Background, video or 3D using the graphical capabilities of the programming language of the implementation (steps 1164 through 1204) and stores them in memory (step 1206). The various types are read and stored until all types for the Presentation have been processed (steps 1208 to step 1164). The Presentation content is then downloaded in a background thread (step 1210), and an internally stored presentation or loading message is displayed until the content is ready (steps 1212 through 1214). After all the Presentation Sprites have been created, the invention enables the user interaction on the device (step 1216) and starts the Presentation (step 1218 and goes to
If the Presentation time has elapsed (step 1302), the invention removes all objects from the screen, clears out the Presentation array and reclaims memory (step 1304). The invention then checks to see if this was the last Presentation (step 1306) and if not, it begins to build the next Presentation (
If the time between Random Sprite creation has not expired (step 560), then control is returned to the Presentation Control Loop (step 572). If it is time to create a Random Sprite, a check is done to see if the maximum Sprite count has been exceeded (step 562). If the count is above the maximum count defined, then control is returned to the Presentation Control Loop (step 572). Otherwise, a check is made to see if image change has been defined for the Random Sprite creation (step 564) and if so, the image is rotated (step 566). The random X and Y coordinates for placement of the Random Sprite on the Presentation are calculated (step 568) and the Random Sprite is created (step 570). Control is then returned to the Presentation Control Loop (step 572).
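The following sketch mirrors those steps under assumed names: check the creation frequency, enforce the maximum count, optionally rotate the image, then place the new sprite at random coordinates; it is illustrative, not the claimed implementation.

```java
import java.util.Random;

// Hypothetical sketch mirroring the Random Sprite steps described above: check the
// creation frequency, enforce the maximum count, optionally rotate the image, then
// place the new sprite at random coordinates.
public class RandomSpriteCreator {

    private final long intervalMillis;   // creation frequency from the instructions
    private final int maxCount;          // maximum sprites from the instructions
    private final boolean rotateImages;  // image-change flag from the instructions
    private final Random random = new Random();
    private long lastCreated = 0;
    private int count = 0;
    private int imageIndex = 0;

    public RandomSpriteCreator(long intervalMillis, int maxCount, boolean rotateImages) {
        this.intervalMillis = intervalMillis;
        this.maxCount = maxCount;
        this.rotateImages = rotateImages;
    }

    // Called once per pass of the Presentation control loop.
    public void tick(long now, int screenWidth, int screenHeight) {
        if (now - lastCreated < intervalMillis) return;  // not time to create yet
        if (count >= maxCount) return;                   // maximum count reached
        if (rotateImages) imageIndex++;                  // rotate to the next image
        int x = random.nextInt(screenWidth);
        int y = random.nextInt(screenHeight);
        System.out.println("Create sprite " + count + " (image " + imageIndex
                + ") at " + x + "," + y);
        lastCreated = now;
        count++;
    }

    public static void main(String[] args) {
        RandomSpriteCreator creator = new RandomSpriteCreator(0, 3, true);
        for (int i = 0; i < 5; i++) {
            creator.tick(System.currentTimeMillis(), 320, 480);
        }
    }
}
```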
A check is then done to see if image change has been defined for the Vertical Moving Sprite (step 610) and if so, the image is rotated (step 612). The Y coordinate is then determined for placement according to the instructions and any keyboard input (step 614) and the Sprite is moved (step 616). Control is then returned to the Presentation Control Loop (step 618).
A check is then done to see if image change has been defined in the instructions for the Lateral Sprite (step 660) and if so, the image is rotated (step 662). The X coordinate is then determined for placement according to the instructions and any keyboard input (steps 664 and 666). Control is then returned to the Presentation Control Loop (step 668).
The X and Y coordinates of the Sprite designated as the Firing Sprite are then determined for origination of the movements of the Projectile Sprite (step 708). The X and Y coordinates are then determined for placement in reference to the Firing Sprite location according to the instructions and any keyboard input (steps 710 and 712). A check is then done to see if image change has been defined for the Projectile Sprite (step 714) and if so, the image is rotated (step 716). The Projectile Sprite is then moved (step 718) and control is then returned to the Presentation Control Loop (step 720).
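A small illustrative sketch of that placement, with hypothetical names: the projectile originates at the Firing Sprite and is then moved relative to that origin according to the instructed step and any user input.

```java
// Hypothetical sketch of the Projectile Sprite placement described above: the
// projectile originates at the Firing Sprite and is then moved relative to that
// origin according to the instructions and any user input.
public class ProjectileSprite {

    private final int originX, originY;  // location of the Firing Sprite
    private final int stepX, stepY;      // per-tick movement from the instructions
    private int offsetX = 0, offsetY = 0;

    public ProjectileSprite(int firingSpriteX, int firingSpriteY, int stepX, int stepY) {
        this.originX = firingSpriteX;
        this.originY = firingSpriteY;
        this.stepX = stepX;
        this.stepY = stepY;
    }

    // Called once per pass of the Presentation control loop; inputDx/inputDy come
    // from keyboard or touch input, if any.
    public void move(int inputDx, int inputDy) {
        offsetX += stepX + inputDx;
        offsetY += stepY + inputDy;
    }

    public int currentX() { return originX + offsetX; }
    public int currentY() { return originY + offsetY; }

    public static void main(String[] args) {
        ProjectileSprite shot = new ProjectileSprite(160, 400, 0, -10);
        shot.move(0, 0);
        System.out.println(shot.currentX() + "," + shot.currentY()); // 160,390
    }
}
```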
A check is then done to see if image change has been defined in the instructions for the Rotating Banner Text Sprite (step 759) and if so, the image is rotated (step 760). If rotate text has been defined for the Rotating Banner Text Sprite (step 762), the text is rotated (step 764).
The X and Y coordinates are then determined for placement according to the instructions and any keyboard input (steps 766 and 768). The Rotating Banner Text Sprite is then moved on the Presentation (step 770). Control is then returned to the Presentation Control Loop (step 772).
A check is then done to see if image change has been defined for the Rotating Banner Sprite (step 810) and if so, the image is rotated (step 812).
The X and Y coordinates are then determined for placement according to the instructions and any keyboard input (steps 814 and 816). The Rotating Banner Sprite is then moved on the Presentation (step 818). Control is then returned to the Presentation Control Loop (step 820).
A check is then done to see if image change has been defined for the Lateral Vertical Sprite (step 860) and if so, the image is rotated (step 862). The X and Y coordinates are then determined for placement according to the instructions and any keyboard input (steps 864 and 866). The Lateral Vertical Sprite is then moved on the Presentation (step 868). Control is then returned to the Presentation Control Loop (step 870).
Once all types have been instantiated, content for all types is downloaded in a background thread (steps 1696 through 1700). Then the interfaces defined in the types are activated (step 1701) and triggering begins (step 1702).
In accordance with the present disclosure, the ability to display and change animation as well as the video and 3D players is accomplished by the efficient and secure delivery of not compiled code instructions to trigger capability in compiled code previously downloaded and running on the mobile device. In a particular embodiment, the device will have specific capabilities (e.g., frame size, movement and timing), and each specific capability performs as a component of broader capability to display and change appearance, functionality and behavior on the device.
In the embodiments discussed herein, the instructions trigger execution of selected capability in compiled code loaded and running on the device to enable various animation, Video and 3D player features. It will be appreciated, however, that such instructions can also be used to trigger capability to exhibit and change any appearance, functionality and behavior on a device, such as color, physical configuration, logic, interface, user interface, and artificial intelligence.
In view of the foregoing, it will also be appreciated that the efficient and secure delivery of not compiled code instructions to trigger the execution of compiled binary code previously loaded and running on a device is a method in itself, and instructions could be devised to trigger any capability, similarly compiled and running, that displays and changes appearance, functionality and behavior. In particular, one could compile into binary code any appearance, functionality and behavior capability in combination with network communication and a listener that waits on interface events; load the code on a device; and then run the compiled binary code on the device, which requests and receives over a network, at any time, one or more not compiled code instructions that trigger one or more compiled binary code capabilities loaded and running on the device to exhibit and change appearance, functionality and behavior on the device.
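Purely as an illustration of that end-to-end method (the URL, class name and plain-text instruction format are assumptions, not part of the disclosure), compiled code on a device might request a list of not compiled instructions over HTTP and hand each one to a previously compiled capability:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

// Hypothetical end-to-end sketch: compiled code running on the device requests a
// plain-text list of not compiled instructions over the network and hands each
// line to capability code that was compiled in beforehand.
public class InstructionFetcher {

    public static void fetchAndTrigger(String serverUrl) throws Exception {
        HttpURLConnection connection =
                (HttpURLConnection) new URL(serverUrl).openConnection();
        connection.setRequestMethod("GET");
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(connection.getInputStream(), StandardCharsets.UTF_8))) {
            String line;
            while ((line = reader.readLine()) != null) {
                handleInstruction(line); // only data crosses the network, never code
            }
        } finally {
            connection.disconnect();
        }
    }

    private static void handleInstruction(String instruction) {
        // In a real implementation this would trigger a previously compiled
        // capability (animation, video, 3D view); here it is simply printed.
        System.out.println("Triggered by: " + instruction);
    }

    public static void main(String[] args) throws Exception {
        fetchAndTrigger("http://example.com/presentations/current.txt");
    }
}
```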
Therefore it is contemplated that instructions, which are characterized as other than compiled code, HTML or scripting language, may be used to trigger or activate capability in previously downloaded code to exhibit and change any appearance, behavior and functionality.
Accordingly, any appearance, functionality and behavior can be exhibited and changed on a device by new instructions requested and received at any time by compiled binary code loaded and running on the device, triggering different appearance, functionality and behavior capability in the compiled binary code loaded and running on the device, without having to recompile the code, download new code, or reprogram said device, and without the user having to download a new compiled application.
It should be understood that the foregoing description is only illustrative of the invention. Various alternatives and modifications can be devised by those skilled in the art without departing from the invention. Accordingly, the present invention is intended to embrace all such alternatives, modifications and variances which fall within the scope of the appended claims.