There is provided an apparatus for converting an object display description document that reduces the load for displaying an image and the capacity necessary for storing document data. A browser including the apparatus is also provided. The apparatus comprises a generating means for generating, from a set of source objects in a document, a set of new objects that yields a display image equivalent to the display image obtained from the set of source objects. The new objects are fewer than the source objects.

Patent: 7667716
Priority: Dec 09 1998
Filed: Dec 06 1999
Issued: Feb 23 2010
Expiry: Jul 17 2023
Extension: 1319 days
Entity: Large
Status: EXPIRED
26. A method for converting an original set of source objects by reducing the number of objects required to display a description document, said method comprising a step of using a processor to generate a set of new objects, from said original set of source objects in the document, a number of said new objects forming a set of new objects fewer than a number of said source objects forming said original set of source objects, to obtain a display image equivalent to the display image obtained from said set of source objects,
wherein said generation step generates said new objects from a semi-transparent source object and other source objects not semi-transparent and located at a layer lower than a layer including said semi-transparent source object and spatially overlapping said semi-transparent source object,
wherein said generating step generates a new merged object including at least a first source object and a second source object superimposed on said first source object.
9. A method for converting an original set of source objects by reducing the number of objects required to display a description document, said method comprising a step of using a processor to generate a set of new objects, from said original set of source objects in the document, a number of said new objects forming a set of new objects fewer than a number of said source objects forming said original set of source objects, to obtain a display image equivalent to the display image obtained from said set of source objects,
wherein said generation step generates said new objects from a semi-transparent source object and other source objects located at a layer lower than a layer including said semi-transparent source object and spatially overlapping said semi-transparent source object,
wherein said generating step generates a new merged object including at least a first source object having an area and a second source object having an area and superimposed on said first source object.
25. An apparatus for converting an original set of source objects by reducing the number of objects required to display a description document, said apparatus comprising a generating means for generating a set of new objects, from said original set of source objects in the document, a number of new objects in said set of new objects being fewer than a number of objects in said original set of source objects, said fewer objects obtaining a display image equivalent to the display of an image obtained from said original set of source objects,
wherein said generating means generates said new objects from a semi-transparent source object and other source objects not semi-transparent and located at a layer lower than a layer including said semi-transparent source object and spatially overlapping said semi-transparent source object,
wherein said generating means generates a new merged object including at least a first source object and a second source object superimposed on said first source object.
27. A storage device having stored thereon a computer program for causing a computer to execute a method for converting an object display description document by reducing the number of objects required for the display, said method comprising a generation step of generating, from an original set of source objects in the document, a set of new objects which are fewer than a number of said objects forming said original set of source objects, in order to obtain a display image equivalent to the display image obtained from said original set of source objects,
wherein said generation step generates new objects from a semi-transparent source object and other source objects not semi-transparent and located at a layer lower than a layer including said semi-transparent source object and spatially overlapping said semi-transparent source object,
wherein said generating step generates a new merged object including at least a first source object and a second source object superimposed on said first source object.
1. An apparatus for converting an original set of source objects by reducing the number of objects required to display a description document, said apparatus comprising a generating means for generating a set of new objects, from said original set of source objects in the document, a number of new objects in said set of new objects being fewer than a number of objects in said original set of source objects, said fewer objects obtaining a display image equivalent to the display of an image obtained from said original set of source objects,
wherein said generating means generates said new objects from a semi-transparent source object and other source objects located at a layer lower than a layer including said semi-transparent source object and spatially overlapping said semi-transparent source object,
wherein said generating means generates a new merged object including at least a first source object having an area and a second source object having an area and superimposed on said first source object.
17. A storage device having stored thereon a computer program for causing a computer to execute a method for converting an object display description document by reducing the number of objects required for the display, said method comprising a generation step of generating, from an original set of source objects in the document, a set of new objects which are fewer than a number of said objects forming said original set of source objects, in order to obtain a display image equivalent to the display image obtained from said original set of source objects,
wherein said generation step generates new objects from a semi-transparent source object and other source objects located at a layer lower than a layer including said semi-transparent source object and spatially overlapping said semi-transparent source object,
wherein said generating step generates a new merged object including at least a first source object having an area and a second source object having an area and superimposed on said first source object.
2. The apparatus as recited in claim 1, wherein said generating means deletes source objects hidden spatially behind another source object which is not semi-transparent.
3. The apparatus as recited in claim 1, wherein generation of said new object from said semi-transparent source object and said other source objects is performed for a time range in which said semi-transparent source object spatially overlaps said other source objects.
4. The apparatus as recited in claim 1, wherein said generating means deletes a source object when a display time for said source object is out of a display time range for said set of source objects.
5. The apparatus as recited in claim 1, further comprising a means for storing said set of new objects to a storage medium.
6. The apparatus as recited in claim 1, further comprising a means for selectively storing said set of source objects or said set of new objects to a storage medium.
7. The apparatus as recited in claim 1, further comprising a means for displaying said set of new objects, wherein said apparatus is used as a browser.
8. The apparatus as recited in claim 1, further comprising a means for selectively displaying said set of source objects or said set of new objects, wherein said apparatus is used as a browser.
10. The method as recited in claim 9, wherein said generation step deletes source objects hidden spatially behind another source object which is not semi-transparent.
11. The method as recited in claim 9, wherein generation of said new object from said semi-transparent source object and said other source objects is performed for a time range in which said semi-transparent source object spatially overlaps said other source objects.
12. The method as recited in claim 9, wherein said generation step deletes a source object when a display time for said source object is out of a display time range for said set of source objects.
13. The method as recited in claim 9, further comprising a step of storing said set of new objects to a storage medium.
14. The method as recited in claim 9, further comprising a step of selectively storing said set of source objects or said set of new objects to a storage medium.
15. The method as recited in claim 9, further comprising a step of displaying said set of new objects.
16. The method as recited in claim 9, further comprising a step of selectively displaying said set of source objects or said set of new objects.
18. The storage device program as recited in claim 17, wherein said generation step deletes source objects hidden spatially behind another source object which is not semi-transparent.
19. The storage device as recited in claim 17, wherein generation of said new object from said semi-transparent source object and said other source objects is performed for a time range in which said semi-transparent source object spatially overlaps said other source objects.
20. The storage device program as recited in claim 17, wherein said generation step deletes a source object when a display time for said source object is out of a display time range for said set of source objects.
21. The storage device as recited in claim 17, further comprising a step of storing said set of new objects to a storage medium.
22. The storage device as recited in claim 17, further comprising a step of selectively storing said set of source objects or said set of new objects to a storage medium.
23. The storage device as recited in claim 17, further comprising a step of displaying said set of new objects.
24. The storage device as recited in claim 17, further comprising a step of selectively displaying said set of source objects or said set of new objects.

1. Field of the Invention

The present invention relates to an apparatus and method for converting an object display description document having functions of spatial synchronization and time synchronization and, more particularly, to an apparatus and method for converting an object display description document conforming to a standard such as MHEG (Multimedia and Hypermedia Experts Group)-5, DHTML (Dynamic HyperText Markup Language), or SMIL (Synchronized Multimedia Integration Language).

2. Description of the Prior Art

JPA 9-503088 entitled “APPARATUS AND METHOD FOR BROWSING INFORMATION” discloses a conventional system for optimizing data obtained at a viewer or browser for a document described with an object display description language.

As shown in FIG. 1, this reference exemplifies information browsing system 101 for browsing and organizing information from plural information sources 123. Knowledge base 109, which includes information source description 113, worldview 115 and system network view 117, is used for establishing a query plan including sub plans. When a query sub plan is executed, the query plan is optimized by pruning redundant sub plans in accordance with the information retrieved by the executed sub plan. Graphical user interface 103 includes a hypertext browser integrated with a knowledge base browser/editor. Graphical user interface 103 allows a user to store an information source description in knowledge base 109 via a graphic operation and to browse previously stored information source descriptions. System 101 reuses query results that have already been constructed, so that queries from the hypertext browser are directed only to the associated data sources for which results have not been constructed. In this way, information that does not need to be queried is fetched from the information sources as seldom as possible.

With respect to an arrangement in which a server sends optimized data, JPA 10-171730 entitled “METHOD OF TRANSFERRING IMAGE” discloses an image transfer method capable of reducing the network load of transferring image data when high-resolution data is required at a client. As shown in FIG. 2, data received at the client is converted into an optimized high-resolution image using image conversion plug-in module 308. In this case, screen resolution setting program 306 is employed to check the resolution of the display device at the client. The result from program 306 is sent to image conversion program 304, so that the transferred data yields the easiest-to-view, highest-quality image for the reader watching the browser. Although optimization at the server side is considered in this manner, not all servers send optimized data. Thus, users are still required to display or store the received data as it is.

The above conventional technology has the following disadvantages.

A first disadvantage arises when the user selects necessary data with reference to its contents and obtains the minimal necessary data, yet that minimal data still contains data that will never be output or displayed for the user. In such a case, the viewer or browser bears the load of analyzing and displaying data that need not be shown to the user. Therefore, the display output is delayed, and the data capacity is increased when data including such unnecessary data is stored.

This is because the conventional viewer/browser has been developed simply to interpret the data sent from the server and produce a display output. A viewer/browser configured in this way analyzes all of the obtained data and produces a display output even if, because of the introduced functions of spatial synchronization and time synchronization, not all of the data needs to be displayed.

A second disadvantage is that although the server optimizes individual object data, including images, as far as possible, it does not perform optimization in consideration of the combination of data to be displayed. In this case, when images overlap each other, the data as a whole simply remains a group of plural motion pictures (objects) and is not substantially optimized.

This is because the functions of spatial synchronization and time synchronization allow the server to superimpose motion pictures and the like at the same time. When the server prepares data to send, it sends the data to the user together with information indicating the superimposition of the corresponding plural motion pictures.

In order to overcome the aforementioned disadvantages, the present invention has been made and has an object to provide an apparatus and method for converting an object display description document, capable of reducing the load for displaying an image and the capacity necessary for storing document data, and to provide a browser including the apparatus.

According to a first aspect of the present invention, there is provided an apparatus for converting an object display description document, comprising a generating means for generating, from a set of source objects in the document, a set of new objects fewer than said set of source objects, to obtain a display image equivalent to the display image obtained from said set of source objects.

According to a second aspect of the present invention, there is provided a method for converting an object display description document, comprising a generation step of generating, from a set of source objects in the document, a set of new objects fewer than said set of source objects, to obtain a display image equivalent to the display image obtained from said set of source objects.

According to a third aspect of the present invention, there is provided a computer program for causing a computer to execute a method for converting an object display description document, said method comprising a generation step of generating, from a set of source objects in the document, a set of new objects fewer than said set of source objects, to obtain a display image equivalent to the display image obtained from said set of source objects.

Other features and advantages of the invention will be apparent from the following description of the preferred embodiments thereof.

The present invention will be more fully understood from the following detailed description with reference to the accompanying drawings in which:

FIG. 1 is a block diagram showing a conventional technology;

FIG. 2 is a block diagram showing another conventional technology;

FIG. 3 is a block diagram showing a structure of an object display description document converter and image browser according to an embodiment of the present invention;

FIG. 4 is a flowchart showing an operation of the object display description document converter and image browser according to the embodiment of the present invention;

FIG. 5 is a flowchart explaining a detailed operation of display process A8 shown in FIG. 4;

FIG. 6 is a flowchart explaining a detailed operation of saving process A10 shown in FIG. 4;

FIG. 7 shows a screen for selecting whether to use the optimization function, employed through selection means 8 and user input means 9;

FIG. 8 is a table showing the optimization analysis contents analyzed by an optimization analysis means 5 of FIG. 3 and methods and process numbers corresponding to the analysis contents;

FIG. 9 is a table showing the contents of an optimization process executed by an optimization means 6 of FIG. 3;

FIG. 10 exemplifies an input object display description document input from a data input means 1 of FIG. 3;

FIG. 11 exemplifies an optimized object display description document produced from the optimization analysis means 5 and optimization means 6 based on the input object display description document shown in FIG. 10; and

FIG. 12 is a block diagram showing another structure of an object display description document converter and image browser according to an embodiment of the present invention.

In the case where a user (client) displays data sent from a sender (server) at a browser or viewer, or stores that data, the present invention automatically optimizes the data after it is input. Thus, the present invention provides a system configuration capable of displaying the same contents as the incoming data at a higher speed and of storing the incoming data with a smaller data capacity.

FIG. 3 is a block diagram of an embodiment of the present invention. FIG. 3 shows such a viewer system, in which data input from data input means 1 is saved in input data saving (temporary storing) means 2, then analyzed by data analysis means 3, and output for display from output means 4. The system further comprises optimization analysis means 5, optimization means 6, optimized data saving (temporary storing) means 7, selection means 8, user input means 9 and storage (proper storing) means 10.

Selection means 8 allows the user to select whether or not to enable the optimization function. When the user selects, from user input means 9, to turn on the optimization function, optimization analysis means 5 analyzes whether the data saved in input data saving means 2 contains a portion that can be optimized. Optimization means 6 performs the optimization process and thereafter saves the optimized data into optimized data saving means 7. When the saved data is to be displayed, it is analyzed by data analysis means 3 and displayed at output means 4. When the saved data is to be stored, it is stored in storage means 10. When the user selects to turn off the optimization function, the input data is analyzed by data analysis means 3 and displayed at output means 4 or stored in storage means 10 as it is. The optimization processes herein are optimization processes corresponding to time synchronization and spatial synchronization. They include, for example, a process for deleting objects that are hidden, both in time and spatially, behind another object displayed at the uppermost location; a process for deleting an object whose display time lies outside the display time range; and a process for making one motion picture object from plural superimposed motion picture objects.
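
As a rough illustration of these optimization processes, the following Python sketch models a display object with a display time range, position, size, layer priority (1 being the uppermost) and a translucency flag, and shows the first two processes: deleting objects completely hidden behind a non-translucent object on a higher layer, and deleting objects whose display time lies outside the whole display time range. This is not the patent's code; every name and the coordinate and priority conventions are assumptions made only for the sketch, and the merging of superimposed motion pictures is sketched separately later.

    # Illustrative sketch only; names and conventions are assumptions, not the patent's code.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class DisplayObject:
        name: str
        start: float            # display start time (minutes)
        end: float              # display end time (minutes)
        x: int
        y: int
        width: int
        height: int
        priority: int           # 1 = uppermost layer
        translucent: bool = False
        is_motion: bool = False

    def covers(upper: DisplayObject, lower: DisplayObject) -> bool:
        """True if 'upper' hides 'lower' both spatially and over its whole display time."""
        spatial = (upper.x <= lower.x and upper.y <= lower.y and
                   upper.x + upper.width >= lower.x + lower.width and
                   upper.y + upper.height >= lower.y + lower.height)
        temporal = upper.start <= lower.start and upper.end >= lower.end
        return spatial and temporal

    def delete_hidden(objects: List[DisplayObject]) -> List[DisplayObject]:
        """Keep only objects not completely hidden behind a non-translucent object above them."""
        return [obj for obj in objects
                if not any(o.priority < obj.priority and not o.translucent and covers(o, obj)
                           for o in objects)]

    def delete_out_of_range(objects: List[DisplayObject],
                            doc_start: float, doc_end: float) -> List[DisplayObject]:
        """Drop objects whose display time lies entirely outside the whole display time range."""
        return [o for o in objects if not (o.end <= doc_start or o.start >= doc_end)]

In the example document of FIG. 10 discussed below, such a sketch would drop text file BBB (completely hidden behind the non-translucent still picture AAA) and motion picture file DDD (displayed entirely outside the 0 to 10 min. range), matching the result derived step by step later in this description.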

Thus, by automatically optimizing the obtained data at the viewer or browser in consideration of how the viewer or browser displays it, the very same contents as the obtained data can be displayed at high speed and stored compactly.

FIG. 3 is a block diagram of an embodiment of an automatic optimization display and storing system according to the present invention. The system comprises data input means 1, input data saving means 2, data analysis means 3 and output means 4, optimization analysis means 5, optimization means 6, optimized data saving means 7, selection means 8, user input means 9, and storage means 10.

An operation of each part will be described next.

Data input means 1 inputs the data of the contents. Input data saving means 2 saves the input data. Data analysis means 3 analyzes the data based on the format in which the data was produced. Output means 4 displays or prints out the analyzed data. Optimization analysis means 5 performs an analysis to search for data to be optimized, and optimization means 6 optimizes that data; the two means operate as multiple processes in association with each other. Optimized data saving means 7 saves the optimized data. Selection means 8 presents information that allows the user to select whether or not to employ the optimization function and reflects the user's selection. User input means 9 accepts from the user the selection input indicating whether the optimization function is employed. Storage means 10, such as a hard drive, stores the data.
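
To make the relationship among these means concrete, a minimal wiring sketch is given below. It is not the patent's implementation; every attribute name is an assumption, and the components are reduced to a single object holding references to the means of FIG. 3.

    # Minimal wiring sketch of FIG. 3; every name here is an assumption.
    class Viewer:
        def __init__(self, data_input, analyzer, optimizer, output, storage):
            self.data_input = data_input     # data input means 1
            self.input_buffer = None         # input data saving (temporary storing) means 2
            self.analyzer = analyzer         # data analysis means 3
            self.output = output             # output means 4
            self.optimizer = optimizer       # optimization analysis means 5 and optimization means 6
            self.optimized_buffer = None     # optimized data saving (temporary storing) means 7
            self.use_optimization = False    # selection means 8, set through user input means 9
            self.use_optimization_for_display = False
            self.use_optimization_for_saving = False
            self.storage = storage           # storage (proper storing) means 10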

The whole operation of the embodiment will be described in detail with reference to FIGS. 3 and 4 next.

First, data is input from data input means 1 (step A1). Next, the input data is saved in input data saving means 2 (step A2).

Whether the optimization function is on or off has been previously selected by the user through selection means 8 and user input means 9. If the user has turned on the optimization function (YES at step A3), optimization analysis means 5 and optimization means 6 operate in association with each other: optimization analysis means 5 searches the data saved in input data saving means 2 for data to be optimized, and optimization means 6 optimizes that data (step A4). When optimization analysis means 5 finally detects that no data remains to be optimized, the optimized data is saved in optimized data saving means 7 (step A6). If the user has turned off the optimization function (NO at step A3), steps A4 and A6 are omitted. Next, it is determined whether to display (step A7). If display is required, the display process is performed (step A8). In either case, the process then advances to the determination of saving (step A9). If saving is required, the saving process is performed (step A10) and the process ends; otherwise the process ends directly.
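
Under the same assumptions as the wiring sketch above, the flow of FIG. 4 can be summarized as follows; the step labels A1 to A10 mirror the flowchart, and display_process and saving_process refer to the display and saving sketches given later.

    # Sketch of the overall flow of FIG. 4; function and attribute names are assumptions.
    def run(viewer, want_display: bool, want_save: bool) -> None:
        data = viewer.data_input.read()                   # A1: input data from data input means 1
        viewer.input_buffer = data                        # A2: save the input data
        if viewer.use_optimization:                       # A3: optimization function ON?
            optimized = viewer.optimizer.optimize(data)   # A4: analyze and optimize until nothing remains
            viewer.optimized_buffer = optimized           # A6: save the optimized data
        if want_display:                                  # A7: display?
            display_process(viewer)                       # A8: display process (FIG. 5)
        if want_save:                                     # A9: save?
            saving_process(viewer)                        # A10: saving process (FIG. 6)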

An embodied example will be described next.

FIG. 5 is a flowchart of an embodied example of display process A8. FIG. 6 is a flowchart of an embodied example of saving process A10. FIG. 7 shows an example of a screen for use in allowing the user to select ON or OFF of the optimization function. FIG. 8 shows an embodied example of the contents to be analyzed at optimization analysis means 5. FIG. 9 shows an embodied example of the contents of process executed at optimization means 6. FIG. 10 exemplifies the data saved in input data saving means 2. FIG. 11 exemplifies the data saved in optimized data saving means 7.

The case where the optimization function is OFF will be described first using an embodied example. Data is input from data input means 1 (A1). The input data, denoted by reference numeral 81 in FIG. 10, includes the following description (an illustrative parsing sketch follows the listing):

<Start of data>

<Display all of the following data from 0 to 10 min.>

<Still picture file AAA, Display from 5 min. to 10 min., Coordinates (10, 10), Size 20×30, Priority 2>

<Text file BBB, Display from 7 min. to 9 min., Coordinates (10, 10), Size 10×10, Priority 3>

<Motion picture file CCC, Display from 5 min. to 10 min., Coordinates (10, 10), Size 20×40, Priority 1, Translucent>

<Motion picture file DDD, Display from 15 min. to 20 min., Coordinates (0, 0), Size 10×10, Priority 1>

<End of data>
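
For illustration only, the bracketed object descriptions above could be parsed into simple records by a sketch such as the following; the regular expression and the dictionary keys are assumptions, since FIG. 10 only exemplifies the description format. The resulting records are the kind of data the DisplayObject sketch given earlier would hold.

    # Sketch of parsing the bracketed object descriptions of FIG. 10; the format is assumed.
    import re

    OBJECT_RE = re.compile(
        r"<(?P<kind>Still picture|Text|Motion picture) file (?P<name>\w+), "
        r"Display from (?P<start>\d+) min\. to (?P<end>\d+) min\., "
        r"Coordinates \((?P<x>\d+), (?P<y>\d+)\), "
        r"Size (?P<w>\d+)[x×](?P<h>\d+), "
        r"Priority (?P<prio>\d+)(?P<trans>, Translucent)?>")

    def parse_document(lines):
        """Return one dictionary per object description line; other lines are skipped."""
        objects = []
        for line in lines:
            m = OBJECT_RE.match(line.strip())
            if m:
                objects.append({
                    "kind": m.group("kind"),
                    "name": m.group("name"),
                    "start": int(m.group("start")), "end": int(m.group("end")),
                    "x": int(m.group("x")), "y": int(m.group("y")),
                    "width": int(m.group("w")), "height": int(m.group("h")),
                    "priority": int(m.group("prio")),
                    "translucent": m.group("trans") is not None,
                })
        return objects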

Input data saving means 2 saves this data (A2). If the optimization function is OFF (NO at A3), then the process advances to the determination of display (A7). If the display is not required, then the process advances to the determination of saving (A9). If the display is required, then the display process is performed (A8).

FIG. 5 shows the flow of display process A8. Since the optimization display is OFF (NO at B1), the input data is selected directly from input data saving means 2 (B3). The input data is then analyzed by data analysis means 3 (B4), and the analyzed data is displayed at output means 4 (B5).

In this case, data analysis means 3 interprets the data contents of <Start of data> as “the data starts here”. It interprets <Display all of the following data from 0 to 10 min.> as “start displaying the whole contents at 0 min. and finish displaying at 10 min.”. It interprets <Still picture file AAA, Display from 5 min. to 10 min., Coordinates (10, 10), Size 20×30, Priority 2> as “display still picture file AAA at coordinates (10, 10) with a size of 20×30 from 5 min. to 10 min. with the second priority from the top”. It interprets <Text file BBB, Display from 7 min. to 9 min., Coordinates (10, 10), Size 10×10, Priority 3> as “display text file BBB at coordinates (10, 10) with a size of 10×10 from 7 min. to 9 min. with the third priority from the top”. It interprets <Motion picture file CCC, Display from 5 min. to 10 min., Coordinates (10, 10), Size 20×40, Priority 1, Translucent> as “display motion picture file CCC translucently at coordinates (10, 10) with a size of 20×40 from 5 min. to 10 min. with the first priority from the top”. It interprets <Motion picture file DDD, Display from 15 min. to 20 min., Coordinates (0, 0), Size 10×10, Priority 1> as “display motion picture file DDD at coordinates (0, 0) with a size of 10×10 from 15 min. to 20 min. with the first priority from the top”. Finally, it interprets <End of data> as “the data finishes here”, and output means 4 displays the contents of these analyzed results.

Next, the process advances to the saving determination (A9). If saving is not required, the series of processes finishes. If saving is required, the process advances to the saving process (A10). FIG. 6 provides a flowchart of saving process A10. Since the optimization saving is determined NO (C1), the data is selected from input data saving means 2 (C3) and stored into storage means 10 (C4). The storing process then finishes and the series of processes terminates.
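
Under the assumed wiring introduced earlier, the flows of FIG. 5 (display process A8) and FIG. 6 (saving process A10) reduce to two small selections. The step labels B1 to B5 and C1 to C4 follow the description above; the attribute and method names remain assumptions.

    # Sketch of display process A8 (FIG. 5) and saving process A10 (FIG. 6); names assumed.
    def display_process(viewer) -> None:
        if viewer.use_optimization_for_display:    # B1: optimization display selected?
            data = viewer.optimized_buffer         # B2: select the optimized data
        else:
            data = viewer.input_buffer             # B3: select the input data directly
        analyzed = viewer.analyzer.analyze(data)   # B4: analyze by data analysis means 3
        viewer.output.show(analyzed)               # B5: display at output means 4

    def saving_process(viewer) -> None:
        if viewer.use_optimization_for_saving:     # C1: optimization saving selected?
            data = viewer.optimized_buffer         # C2: select the optimized data
        else:
            data = viewer.input_buffer             # C3: select the input data
        viewer.storage.write(data)                 # C4: store into storage means 10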

An embodied example of the case where the optimization function is ON will be described next. FIG. 7 is an image of user display message 51 that allows the user to select ON or OFF of the optimization function through selection means 8 from user input means 9. In this case, "Use of optimization function" is set to ON, "Use for display" to ON, and "Use for saving" to OFF. The data denoted with reference numeral 81 of FIG. 10 is input from data input means 1 (A1) and saved in input data saving means 2 (A2). Since the optimization function is ON, the determination at step A3 is YES. Optimization analysis means 5 then analyzes what can be optimized and optimization means 6 performs the optimization process. An embodied example of the optimization analysis performed by optimization analysis means 5 is shown in FIG. 8 with reference numeral 61; the contents are analyzed in consideration of time synchronization and spatial synchronization.

If the analysis contents are that “1. Plural objects superimpose, display times for other objects are completely included within a display time for the uppermost object, a size of the uppermost object is larger than sizes of other objects, and the uppermost object hides other objects even considering display locations”, 1-1 or 1-2 applies:

1-1. If the uppermost object is neither transparent nor translucent, then Process 1 is selected for the other objects (see FIG. 9); and

1-2. If the uppermost object is transparent or translucent, then the uppermost object is removed and analysis contents 1 are applied again to the remaining plural objects. The term “removing” herein differs from the term “deleting” used in Process 1: it means removing the object only from the objects to be analyzed.

If the analysis contents are that “2. Plural objects superimpose in time and spatially, and the uppermost object is translucent or transparent”, 2-1 or 2-2 applies:

2-1. If a motion picture is contained, Process 2 is selected for all objects over the entire superimposing time; and

2-2. If no motion picture is contained, Process 3 is selected for all objects over the entire superimposing time.

If the analysis contents are that “N. A display time for an object is completely in excess of the whole display time range”, Process 1 is performed for that object.

FIG. 9 shows an embodied example of the optimization process contents with reference numeral 71, which lists process numbers and the corresponding process contents: “Process 1” corresponds to “Deleting object”; “Process 2” to “Making a new motion picture from a motion picture consisting of plural superimposed objects”; “Process 3” to “Making a new still picture from a still picture consisting of plural superimposed objects”; and “Process M” to “Contents M”.
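
Putting the analysis contents of FIG. 8 and the processes of FIG. 9 together, a deliberately simplified rule loop is sketched below. It reuses the covers helper from the earlier sketch, and find_translucent_overlap, merge_to_motion and merge_to_still are assumed helpers standing in for the operations named in the figures; the patent defines these operations only through the figures and the example that follows.

    # Simplified sketch of step A4: repeat the analysis of FIG. 8 and the processes of
    # FIG. 9 until no object matches any analysis content. Helper names are assumptions.
    def optimize_objects(objects, doc_start, doc_end):
        changed = True
        while changed:
            changed = False
            # Analysis content N -> Process 1: delete objects displayed entirely
            # outside the whole display time range.
            for o in [o for o in objects if o.end <= doc_start or o.start >= doc_end]:
                objects.remove(o)
                changed = True
            # Analysis content 1: an uppermost object completely hides others in time
            # and space. 1-1 -> Process 1 (delete the hidden objects); 1-2 -> a
            # translucent uppermost object is only removed from the analysis, which
            # here simply means skipping it and treating the next object as uppermost.
            for upper in sorted(objects, key=lambda o: o.priority):
                if upper.translucent:
                    continue                   # 1-2: analyze the remaining objects
                hidden = [o for o in objects
                          if o is not upper and o.priority > upper.priority and covers(upper, o)]
                for o in hidden:               # 1-1: Process 1, delete hidden objects
                    objects.remove(o)
                    changed = True
            # Analysis content 2: objects overlap in time and space under a translucent
            # or transparent uppermost object. 2-1 -> Process 2 (new motion picture);
            # 2-2 -> Process 3 (new still picture).
            group = find_translucent_overlap(objects)      # assumed helper
            if group:
                merged = (merge_to_motion(group) if any(o.is_motion for o in group)
                          else merge_to_still(group))
                objects = [o for o in objects if o not in group] + [merged]
                changed = True
        return objects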

The data in the above embodied example is subjected to the optimization analysis and optimization process (step A4 in FIG. 4) as follows. The data denoted with numeral 81 in FIG. 10 includes three object descriptions subject to the optimization analysis:

<Still picture file AAA, Display from 5 min. to 10 min., Coordinates (10, 10), Size 20×30, Priority 2>; <Text file BBB, Display from 7 min. to 9 min., Coordinates (10, 10), Size 10×10, Priority 3>; and <Motion picture file CCC, Display from 5 min. to 10 min., Coordinates (10, 10), Size 20×40, Priority 1, Translucent>. These data meet the condition “1-2. If the uppermost object is transparent or translucent, then the uppermost object is removed and analysis contents 1 are applied to the remaining plural objects”. Accordingly, the object with the uppermost priority, <Motion picture file CCC, Display from 5 min. to 10 min., Coordinates (10, 10), Size 20×40, Priority 1, Translucent>, is removed, and the objects <Still picture file AAA, Display from 5 min. to 10 min., Coordinates (10, 10), Size 20×30, Priority 2> and <Text file BBB, Display from 7 min. to 9 min., Coordinates (10, 10), Size 10×10, Priority 3> are analyzed by analysis contents 1. In this case, the condition “1-1. If the uppermost object is neither transparent nor translucent, then Process 1 is selected for the other objects” applies. As a result, the object <Text file BBB, Display from 7 min. to 9 min., Coordinates (10, 10), Size 10×10, Priority 3> is deleted, since the contents of Process 1, “Deleting object”, are applied to it.

The optimization analysis “2. Plural objects superimpose in time and spatially, and the uppermost object is translucent or transparent” applies to the two remaining objects: <Still picture file AAA, Display from 5 min. to 10 min., Coordinates (10, 10), Size 20×30, Priority 2> and <Motion picture file CCC, Display from 5 min. to 10 min., Coordinates (10, 10), Size 20×40, Priority 1, Translucent>. Between 2-1 and 2-2, case 2-1 applies because a motion picture is contained, so “Process 2 for all objects over the entire superimposing time” is performed. The contents of Process 2 are “Making a new motion picture from a motion picture consisting of plural superimposed objects”. A new motion picture file XXX is therefore made by superimposing the translucent motion picture file CCC on the still picture file AAA, and a new object is made: <Motion picture file XXX, Display from 5 min. to 10 min., Coordinates (10, 10), Size 20×40, Priority 1>.
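
As one possible way to realize Process 2 for this example, each frame of the translucent motion picture file CCC can be composited over the still picture file AAA. The sketch below assumes a fixed 0.5 alpha, equal frame sizes (AAA padded to the size of CCC), and frames represented as 2-D lists of RGB tuples; none of these details are specified by the patent.

    # Sketch only: composite a translucent motion picture over a still picture, frame by frame.
    def blend_pixel(top, bottom, alpha=0.5):
        """Per-channel blend: out = alpha * top + (1 - alpha) * bottom."""
        return tuple(int(alpha * t + (1 - alpha) * b) for t, b in zip(top, bottom))

    def merge_translucent_motion(motion_frames, still_frame, alpha=0.5):
        """Return the frames of the merged motion picture (file XXX in the example)."""
        merged = []
        for frame in motion_frames:                # one frame of CCC at a time
            merged.append([[blend_pixel(frame[y][x], still_frame[y][x], alpha)
                            for x in range(len(frame[0]))]
                           for y in range(len(frame))])
        return merged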

Despite the whole display time description <Display all of the following data from 0 to 10 min.> in the input data file with reference numeral 81, the object <Motion picture file DDD, Display from 15 min. to 20 min., Coordinates (0, 0), Size 10×10, Priority 1> is included in the described objects. This object therefore matches the analysis contents “N. A display time for an object is completely in excess of the whole display time range” and corresponds to “Process 1 for that object”. Since the contents of Process 1 are “Deleting object”, the object <Motion picture file DDD, Display from 15 min. to 20 min., Coordinates (0, 0), Size 10×10, Priority 1> is deleted.

The optimization finishes when, in the flow described above, no object remains to which the optimization analysis contents apply, and the optimized data is saved in optimized data saving means 7 (A6). The optimized data has the contents shown in FIG. 11 with reference numeral 91 (a serialization sketch follows the listing):

<Start of data>

<Motion picture file XXX, Display from 5 min. to 10 min., Coordinates (10, 10), Size 20×40, Priority 1>

<End of data>
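
Conversely to the parsing sketch given earlier, the optimized object records can be written back out in the same bracketed form. The sketch below simply mirrors the wording of the FIG. 10 and FIG. 11 examples and assumes records shaped like those produced by the parsing sketch; it is an illustration, not the patent's output routine.

    # Sketch: serialize object records back into the bracketed description format.
    def serialize_document(objects, doc_start, doc_end):
        lines = ["<Start of data>",
                 f"<Display all of the following data from {doc_start} to {doc_end} min.>"]
        for o in objects:
            translucent = ", Translucent" if o["translucent"] else ""
            lines.append(
                f"<{o['kind']} file {o['name']}, "
                f"Display from {o['start']} min. to {o['end']} min., "
                f"Coordinates ({o['x']}, {o['y']}), "
                f"Size {o['width']}×{o['height']}, "
                f"Priority {o['priority']}{translucent}>")
        lines.append("<End of data>")
        return lines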

Next, if it is determined at the display determination to display the data (YES at step A7), the process advances to the display process (step A8). Since the user selection shown in FIG. 7 is to use the optimization for display, the process advances from step B1 to step B2, and the optimized data with reference numeral 91 saved in optimized data saving means 7 is selected. Then, data analysis means 3 analyzes the data (step B4), and output means 4 displays the analyzed data (step B5).

Next, if it is determined at the saving determination to save the data (YES at step A9), the process advances to the saving process (step A10). Since the user selection shown in FIG. 7 is not to perform the optimization saving (saving of the optimized data), the process advances from step C1 to step C3, and the input data is read out from input data saving means 2 and stored to storage means 10.

As described above, the input data can be displayed after being automatically optimized rather than displayed directly, while being stored directly as input. The selections of the input data and the optimized data for display and for storing can be freely combined.

FIG. 12 is a block diagram showing a second structure of an apparatus for converting an object display description document, which performs the method for converting an object display description document explained above. The second structure is implemented with a computer.

The second structure comprises CPU 1001, main memory 1002, external storage 1003, input device 1004, interface 1005, display 1006, printer 1007, and bus 1000 connecting all of these components.

CPU 1001 is, for example, a microprocessor, a microcomputer, or a DSP. Main memory 1002 is, for example, a RAM. External storage 1003 is, for example, a hard drive, an optical disc, or a magnetooptical disc. Input device 1004 is, for example, a computer mouse, a keyboard, or a data tablet. Interface 1005 is, for example, a communication interface connected to a Web server through the Internet. Display 1006 is, for example, a CRT or an LCD.

CPU 1001 functions as data analysis means 3, optimization analysis means 5, and optimization means 6 by executing instructions of a computer program. The computer program is stored in external storage 1003 and temporarily loaded into main memory 1002 so as to be fetched by CPU 1001 at execution time. Main memory 1002 not only temporarily stores the computer program but also functions as input data saving means 2 and optimized data saving means 7. External storage 1003 not only stores the computer program but also functions as storage means 10. Input device 1004 functions as user input means 9. Interface 1005 functions as data input means 1. Display 1006 functions as output means 4 and selection means 8. Printer 1007 functions as output means 4.

As described above, according to the present invention, the following effects are achieved.

A first effect is as follows. Even if the data obtained by the user contains data that will never be output or displayed for the user, the viewer or browser is not required to analyze or display that unnecessary data. Therefore, no extra load is produced and an appropriate display output speed can be achieved. In addition, the obtained data can, if required, be stored without the unnecessary data and thus with an appropriate capacity. This is because an optimization routine that takes time synchronization and spatial synchronization into consideration is integrated in the client such as the viewer/browser. Therefore, the data given from the server is not all output directly for display at the viewer or browser. Instead, the optimization, for example deletion, is performed on objects that never need to be displayed within the display time range and on objects hidden behind other objects. Thus, the optimized data can be displayed or stored.

A second effect is that a motion picture (object) group consisting of plural superimposed motion pictures (objects) sent from the server can be displayed at a comfortable speed, and the storing capacity can also be reduced. This is because, before display or saving, the viewer/browser performs an optimization that makes one object from plural objects including motion pictures in consideration of time synchronization and spatial synchronization. Thus, a high-speed display and a small data capacity for storage can be achieved.

Having described the embodiments consistent with the present invention, other embodiments and variations consistent with the present invention will be apparent to those skilled in the art. Therefore, the invention should not be viewed as limited to the disclosed embodiments but rather should be viewed as limited only by the spirit and scope of the appended claims.

Nishiura, Sachiko

Patent Priority Assignee Title
4594673, Jun 28 1983 REDIFUSSION COMPUTER GRAPHICS, INC , A CORP OF DE Hidden surface processor
4924414, Sep 24 1986 Daikin Industries, Ltd. Apparatus and method for obtaining priority numbers for drawing figures forming a display figure
5084830, Oct 26 1987 AMERICAN VIDEO GRAPHICS, L P Method and apparatus for hidden surface removal
5086496, May 02 1988 Arch Development Corporation Method for hidden line and surface removal in a three dimensional display
5088054, May 09 1988 Computer graphics hidden surface removal system
5125074, Aug 10 1988 Thomson-CSF Process for eliminating hidden faces for synthesis of a three-dimensional wire image
5386505, Nov 15 1990 International Business Machines Corporation Selective control of window related overlays and underlays
5388192, Jan 30 1991 Dainippon Screen Mfg. Co., Ltd. Image layout processing method and apparatus
5448686, Jan 02 1992 International Business Machines Corporation; INTERNATIONAL BUSINESS MACHINES CORPORATION A CORPORATION OF NEW YORK Multi-resolution graphic representation employing at least one simplified model for interactive visualization applications
5463728, Mar 10 1993 AT&T Corp. Electronic circuits for the graphical display of overlapping windows with transparency
5559950, Feb 02 1994 IGT Graphics processor enhancement unit
5574835, Apr 06 1993 Apple Inc Bounding box and projections detection of hidden polygons in three-dimensional spatial databases
5583542, May 26 1992 Apple Inc Method for deleting objects on a computer display
5600831, Feb 28 1994 Alcatel Lucent Apparatus and methods for retrieving information by modifying query plan based on description of information sources
5606651, Jun 30 1994 International Business Machines Corporation Process for merging CAD vector files and automatically removing duplicate and obsolete patterns
5615322, Aug 26 1992 Namco Bandai Games INC Image synthesizing system for producing three-dimensional images
5640496, Feb 04 1991 MEDICAL INSTRUMENTATION AND DIAGNOSTICS CORPORATION MIDCO Method and apparatus for management of image data by linked lists of pixel values
5692117, Nov 30 1990 Cambridge Animation Systems Limited Method and apparatus for producing animated drawings and in-between drawings
5768578, Feb 28 1994 Alcatel Lucent User interface for information retrieval system
5802211, Dec 30 1994 Harris Corporation Method and apparatus for transmitting and utilizing analog encoded information
5864342, Aug 04 1995 Microsoft Technology Licensing, LLC Method and system for rendering graphical objects to image chunks
5926185, May 03 1996 Barco Graphics Method for processing a set of page description language commands to reduce complexity
5990904, Aug 04 1995 Microsoft Technology Licensing, LLC Method and system for merging pixel fragments in a graphics rendering system
6052125, Jan 07 1998 Rockwell Collins Simulation And Training Solutions LLC Method for reducing the rendering load for high depth complexity scenes on a computer graphics display
6069633, Sep 18 1997 Meta Platforms, Inc Sprite engine
6125323, Apr 28 1996 AISIN AW CO , LTD Device for processing road data or intersection data
6144388, Mar 06 1998 Process for displaying articles of clothing on an image of a person
6144972, Jan 31 1996 Mitsubishi Denki Kabushiki Kaisha Moving image anchoring apparatus which estimates the movement of an anchor based on the movement of the object with which the anchor is associated utilizing a pattern matching technique
6191797, May 22 1996 Canon Kabushiki Kaisha Expression tree optimization for processing obscured graphical objects
6215503, May 29 1998 Microsoft Technology Licensing, LLC Image generator and method for resolving non-binary cyclic occlusions with image compositing operations
6266064, May 29 1998 Microsoft Technology Licensing, LLC Coherent visibility sorting and occlusion cycle detection for dynamic aggregate geometry
6266068, Mar 13 1998 HEWLETT-PACKARD DEVELOPMENT COMPANY, L P Multi-layer image-based rendering for video synthesis
6317128, Apr 18 1996 AUTODESK CANADA CO Graphical user interface with anti-interference outlines for enhanced variably-transparent applications
6320580, Mar 24 1998 Sega Enterprises, Ltd. Image processing apparatus
6356281, Sep 22 1994 TWITTER, INC Method and apparatus for displaying translucent overlapping graphical objects on a computer monitor
6421736, Dec 27 1995 International Business Machines Corporation Method and system for migrating an object between a split status and a merged status
6456285, May 06 1998 Microsoft Technology Licensing, LLC Occlusion culling for complex transparent scenes in computer generated graphics
EP737930,
JP10171730,
JP7244571,
JP9167124,
WO9523371,
Assignments:
Dec 01 1999: NISHIURA, SACHIKO to NEC Corporation; ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS); Reel/Frame 010447/0143
Dec 06 1999: NEC Corporation (assignment on the face of the patent)
Sep 01 2011: NEC Corporation to NEC PERSONAL COMPUTERS, LTD; ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS); Reel/Frame 027154/0735
Date Maintenance Fee Events
Jul 24 2013, M1551: Payment of Maintenance Fee, 4th Year, Large Entity.
Oct 09 2017, REM: Maintenance Fee Reminder Mailed.
Mar 26 2018, EXP: Patent Expired for Failure to Pay Maintenance Fees.

