A method and device for the pictorial representation of space-related data, for example geographical data of the earth. Such methods are used for visualising topographical or meteorological data in the form of weather maps or weather forecast films. Further fields of application are found in tourism, in traffic control, in navigation aids and also in studio technology. The space-related data, for example topography, current cloud distribution, configurations of roads, rivers or frontiers, satellite images, current temperatures, historical views, CAD models, live camera shots, are called up, stored or generated in a spatially distributed fashion. For a screen representation of a view of the object according to a field of view of a virtual observer, the required data are called up and shown only in the resolution required for each individual section of the image. The sub-division of the image into sections with different spatial resolutions is preferably effected according to a binary-tree or quadtree scheme.
1. A method of providing a pictorial representation of space-related data of a selectable object, the representation corresponding to a view of the object by an observer with a selectable location and a selectable direction of view, the method comprising:
(a) providing a plurality of spatially distributed data sources for storing space-related data;
(b) determining a field of view including the area of the object to be represented through the selection of a distance of the observer to the object and an angle of view of the observer to the object;
(c) requesting data for the field of view from at least one of the plurality of spatially distributed data sources;
(d) centrally storing the data for the field of view;
(e) representing the data for the field of view in a pictorial representation having one or more sections;
(f) using a computer, dividing each of the one or more sections having image resolutions below a desired image resolution into a plurality of smaller sections, requesting higher-resolution space-related data for each of the smaller sections from at least one of the plurality of spatially distributed data sources, centrally storing the higher-resolution space-related data, and representing the data for the field of view in the pictorial representation; and
(g) repeating step (f), dividing the sections into smaller sections, until every section has the desired image resolution or no higher image resolution data is available.
2. The method of pictorial representation defined in
3. The method of pictorial representation defined in
4. The method of pictorial representation defined in
5. The method of pictorial representation defined in
6. The method of pictorial representation defined in
7. The method of pictorial representation defined in
8. The method of pictorial representation defined in
9. The method of pictorial representation defined in
10. The method of pictorial representation defined in
11. The method of pictorial representation defined in
12. The method of pictorial representation defined in
13. The method of pictorial representation defined in
14. The method of pictorial representation defined in
15. The method of pictorial representation defined in
16. The method of pictorial representation defined in
17. The method of pictorial representation defined in
18. The method of pictorial representation defined in
19. The method of pictorial representation defined in
20. The method of pictorial representation defined in
21. The method of pictorial representation defined in
22. The method of pictorial representation defined in
23. The method of pictorial representation defined in
24. The method of pictorial representation defined in
25. The method of pictorial representation defined in
26. The method of pictorial representation defined in
28. The method of pictorial representation defined in
29. The method of pictorial representation defined in
30. The method of pictorial representation defined in
31. The method of pictorial representation defined in
32. The method of pictorial representation defined in
33. The method of pictorial representation defined in
34. The method of pictorial representation defined in
35. The method of pictorial representation defined in
36. The method of pictorial representation defined in
37. The method of pictorial representation defined in
38. The method of pictorial representation defined in
39. The method of pictorial representation defined in
40. The method of pictorial representation defined in
41. The method of pictorial representation defined in
42. The method of pictorial representation defined in
The invention relates to a method and a device for the pictorial representation of space-related data, particularly geographical data of planar or three-dimensional objects. Such methods are used, for example, for visualising topographic or meteorological data in the form of weather maps or weather forecast films. Further fields of application are found in tourism, in traffic control, in navigation aids and in studio technology.
According to the prior art, representations of geographical information are generated using a so-called paintbox, which produces maps of a desired area from given geographical information. These maps can then be selectably altered, for example coloured or emphasised according to states, or represented in an altered projection.
Another system for generating views of a topography is found in known flight simulation technology, which renders 00,000 texturised triangles per second and is consequently suitable for rapid picture build-up. It operates with floating-point values in a 32-bit representation. As this accuracy is insufficient, in the present example, to follow a movement of an observer from space continuously down to a centimetre resolution on the earth, the co-ordinates of the data are continuously converted during such a movement to a new co-ordinate system whose origin is located in the vicinity of the observer.
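The precision limit motivating this re-origination can be illustrated with a short sketch (Python is used here purely for illustration, and the earth-radius figure is an assumed value, not from the text): at a distance of some 6,400 km from the coordinate origin, a 32-bit float cannot resolve a centimetre, whereas after shifting the origin into the vicinity of the observer the same centimetre offset is representable.

```python
import struct

def to_f32(x: float) -> float:
    """Round-trip a Python float through IEEE-754 single precision (32 bit)."""
    return struct.unpack('<f', struct.pack('<f', x))[0]

EARTH_RADIUS_M = 6_371_000.0  # assumed value, for illustration only

# Global coordinates: a 1 cm step near the earth's surface is lost,
# because the float32 spacing (ulp) at ~6.4e6 m is 0.5 m.
global_pos = to_f32(EARTH_RADIUS_M)
stepped = to_f32(EARTH_RADIUS_M + 0.01)
print(stepped == global_pos)  # the centimetre step vanishes in 32 bit

# After moving the coordinate origin into the observer's vicinity,
# the same 1 cm offset survives in the 32-bit representation.
local_offset = to_f32(0.01)
print(abs(local_offset - 0.01) < 1e-8)
```

This is why the coordinates are re-expressed relative to an origin near the observer: the available significand bits are then spent on the locally relevant digits.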
The geographical data required for the image are called up and transmitted via the collecting network 6 from the spatially distributed memories 4. The spatially distributed memories are preferably located in the vicinity of the areas of the earth whose data they contain. In this way the data are detected, stored and maintained at the point where knowledge of the properties to be represented by the data, such as, for example, topography or political or social information, is most precise. Further data sources are located at the points where further data are detected or assembled, such as, for example, meteorological research stations which collect and process information received from satellites.
A characteristic feature of the data flow in the collecting network 6 is that it runs in one direction only. Internet or ISDN lines were used for this network.
The interchange network 7 serves to interchange data between individual nodes. By means of a close-meshed connection of the individual nodes, the network can be secured against the failure of individual lines or against load peaks. As the interchange network 7 must guarantee a high transmission rate in both directions, a permanent connection with an asynchronous transmission protocol and a transmission rate greater than 35 MBit/s was used here. Satellite connections are also suitable for the interchange network 7.
In the supply network 8, essentially the image data required for representation are transmitted to the display device 5. Consequently a high data transmission rate of up to 2 MBit/s is required in the direction of the display unit, which is achieved by native asynchronous connections or by bundling ISDN connections.
In this embodiment given by way of example, a two-dimensional polygon grid model is used to display the data; it serves as a two-dimensional co-ordinate system for positioning the data. The data displayed were, for example, satellite images, i.e. information relating to the colouring of the earth's surface, geopolitical data, or current or stored meteorological data. Images of the same point on the earth's surface were shown at different points in time, so that a type of "time journey" could be produced.
Data in tabular form, such as, for example, temperature information, were overlaid on the view as display tables. For certain areas, CAD models of buildings were available and were inserted into the view. The location of the observer could then be displaced at will within these CAD-modelled buildings.
Via position-fixing systems, symbols, for example for ships, aircraft or motor vehicles, can be inserted into this system in their instantaneous geographical positions and/or animated.
As a model for sub-dividing the field of view into sections, and those sections into further sections, a quadrant tree (quadtree) was used, in which an area is progressively sub-divided into four sections at each step.
After selection of the earth as an object and input of a location and a direction of view at the display device 5, the node 3 determines the field of view of the observer and calls up the data via the interchange network 7 and the nodes 1 and 2. These nodes in turn call up the required data via the collecting network 6 from the spatially distributed data sources 4 or, for example, from the camera 9, and transmit them over the interchange network 7 to the node 3 for central storage. The node 3 determines the representation of the data centrally stored therein and sends this representation for viewing over the supply network 8 to the display device 5.
If the node 3 then ascertains that the required screen resolution has not been achieved with the centrally stored data, it divides the field of view into four sections according to the quadtree model and checks each section to see whether the required image resolution has been achieved by representation of the data contained in that section. If not, the node 3 calls up further data for this section. This method is repeated for each section until the required image resolution is achieved in the entire view. In this example, data are always called up with the same resolution of 128×128 points. Owing to the sub-division of a section into four sub-sections, each data transmission therefore loads data with a spatial accuracy four times higher.
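The refinement loop just described can be sketched as follows (a simplified illustration: the 128×128 tile size comes from the text, while the data-source interface `fetch_tile` and the resolution test `resolution_of` are hypothetical placeholders for the network calls the text does not specify in detail):

```python
TILE_POINTS = 128  # each call-up delivers a 128x128 tile, as in the text

def refine(section, required_resolution, fetch_tile, resolution_of):
    """Recursively sub-divide a section (quadtree) until the data held for it
    reach the required image resolution, or no finer data are available.

    `section` is (x, y, size) in object coordinates; `fetch_tile` and
    `resolution_of` stand in for the data-source calls. Returns the list
    of tiles to display for this section.
    """
    tile = fetch_tile(section, TILE_POINTS)
    if tile is None:
        return []  # nothing stored for this section at all
    if resolution_of(section, tile) >= required_resolution:
        return [tile]  # coarse data already suffice here
    x, y, size = section
    half = size / 2
    children = []
    # divide into four quadrants and refine each independently
    for qx, qy in ((x, y), (x + half, y), (x, y + half), (x + half, y + half)):
        children.extend(refine((qx, qy, half), required_resolution,
                               fetch_tile, resolution_of))
    # if no finer data exist for the quadrants, keep the coarse tile
    return children if children else [tile]
```

Because every level delivers the same 128×128 tile, each sub-division quadruples the spatial accuracy of the loaded data, exactly as described above.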
Because the data are centrally stored, section by section, only in the accuracy required for the image resolution, the amount of centrally stored data depends substantially only on the desired image resolution.
If, for example, the observer is located approximately 1,000 m above the earth's surface, the field of view has an extent of approximately 50 km×50 km. The image resolution in this case should be greater than 3,000×3,000 image points. In order to show the field of view with this image resolution, a height value is required every 150 m and an image value of the surface every 15 m. From this there arises a central storage requirement of approximately 35.6 MBytes to store all the information required for showing the image.
If, however, the observer is located in space and has the northern hemisphere fully in the field of view, then a representation with the same image resolution requires a height value every 50 km and an image value of the surface every 5 km. In all there arises a central storage requirement of 39.2 MBytes, which is of the same order of magnitude as the storage requirement for representing the view of the earth's surface from a height of 1,000 m in the 50 km×50 km section.
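The first of these storage estimates can be checked with a small calculation (an illustrative sketch: the text gives only the sample spacings, so the bytes-per-sample figures below are assumptions, and the result reproduces only the order of magnitude of the ~35.6 MBytes stated):

```python
def storage_bytes(extent_m, image_spacing_m, height_spacing_m,
                  bytes_per_image=3, bytes_per_height=4):
    """Rough central-storage estimate for a square field of view.

    bytes_per_image and bytes_per_height are assumed values (e.g. RGB
    colour and a 32-bit height); the text does not state them.
    """
    image_samples = (extent_m / image_spacing_m) ** 2
    height_samples = (extent_m / height_spacing_m) ** 2
    return image_samples * bytes_per_image + height_samples * bytes_per_height

# View from 1,000 m: 50 km x 50 km, image value every 15 m, height every 150 m
mb = storage_bytes(50_000, 15, 150) / 1e6
# yields roughly 34 MBytes, the same order as the ~35.6 MBytes in the text
```

The point of the comparison in the text is that both the close-up view and the view of the whole hemisphere land at a few tens of megabytes, because the number of samples, not the physical extent, drives the storage requirement.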
A further advantage of this type of address formation is that each section of the object to be represented has a fixed address, which greatly simplifies the search for the associated data.
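Such a fixed address can be formed, for example, from the sequence of quadrant choices on the path from the root of the quadtree down to the section (a hypothetical encoding for illustration; the text does not specify the exact address format):

```python
def quadtree_address(x: float, y: float, depth: int) -> str:
    """Encode the quadtree cell containing the normalised point (x, y),
    with x, y in [0, 1), as a string of quadrant digits 0-3, one digit
    per sub-division level.

    A section and all its sub-sections share the same address prefix,
    so the address both identifies a section and locates its data.
    """
    digits = []
    for _ in range(depth):
        x, y = x * 2, y * 2
        qx, qy = int(x), int(y)          # which half along each axis
        digits.append(str(qx + 2 * qy))  # digit encodes the quadrant
        x, y = x - qx, y - qy            # recurse into that quadrant
    return ''.join(digits)
```

Looking up the data for a section then reduces to a prefix match on these fixed addresses, which is the simplification the paragraph above refers to.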
Schmidt, Axel, Mayer, Pavel, Sauter, Joachim, Grüneis, Gerold
Patent | Priority | Assignee | Title |
10540804, | Apr 22 2014 | GOOGLE LLC | Selecting time-distributed panoramic images for display |
11163813, | Apr 22 2014 | GOOGLE LLC | Providing a thumbnail image that follows a main image |
11860923, | Apr 22 2014 | GOOGLE LLC | Providing a thumbnail image that follows a main image |
8737683, | Aug 28 2008 | GOOGLE LLC | Architectures and methods for creating and representing time-dependent imagery |
8872847, | Aug 28 2008 | GOOGLE LLC | Architectures and methods for creating and representing time-dependent imagery |
9099057, | Aug 28 2008 | GOOGLE LLC | Architectures and methods for creating and representing time-dependent imagery |
9542723, | Aug 28 2008 | GOOGLE LLC | Architectures and methods for creating and representing time-dependent imagery |
9916070, | Aug 28 2008 | GOOGLE LLC | Architectures and methods for creating and representing time-dependent imagery |
9934222, | Apr 22 2014 | GOOGLE LLC | Providing a thumbnail image that follows a main image |
9972121, | Apr 22 2014 | GOOGLE LLC | Selecting time-distributed panoramic images for display |
D780210, | Apr 22 2014 | GOOGLE LLC | Display screen with graphical user interface or portion thereof |
D780211, | Apr 22 2014 | GOOGLE LLC | Display screen with graphical user interface or portion thereof |
D780777, | Apr 22 2014 | GOOGLE LLC | Display screen with graphical user interface or portion thereof |
D780794, | Apr 22 2014 | GOOGLE LLC | Display screen with graphical user interface or portion thereof |
D780795, | Apr 22 2014 | GOOGLE LLC | Display screen with graphical user interface or portion thereof |
D780796, | Apr 22 2014 | GOOGLE LLC | Display screen with graphical user interface or portion thereof |
D780797, | Apr 22 2014 | GOOGLE LLC | Display screen with graphical user interface or portion thereof |
D781317, | Apr 22 2014 | GOOGLE LLC | Display screen with graphical user interface or portion thereof |
D781318, | Apr 22 2014 | GOOGLE LLC | Display screen with graphical user interface or portion thereof |
D781337, | Apr 22 2014 | GOOGLE LLC | Display screen with graphical user interface or portion thereof |
D791811, | Apr 22 2014 | GOOGLE LLC | Display screen with graphical user interface or portion thereof |
D791813, | Apr 22 2014 | GOOGLE LLC | Display screen with graphical user interface or portion thereof |
D792460, | Apr 22 2014 | GOOGLE LLC | Display screen with graphical user interface or portion thereof |
D829737, | Apr 22 2014 | GOOGLE LLC | Display screen with graphical user interface or portion thereof |
D830399, | Apr 22 2014 | GOOGLE LLC | Display screen with graphical user interface or portion thereof |
D830407, | Apr 22 2014 | GOOGLE LLC | Display screen with graphical user interface or portion thereof |
D835147, | Apr 22 2014 | GOOGLE LLC | Display screen with graphical user interface or portion thereof |
D868092, | Apr 22 2014 | GOOGLE LLC | Display screen with graphical user interface or portion thereof |
D868093, | Apr 22 2014 | GOOGLE LLC | Display screen with graphical user interface or portion thereof |
D877765, | Apr 22 2014 | GOOGLE LLC | Display screen with graphical user interface or portion thereof |
D933691, | Apr 22 2014 | GOOGLE LLC | Display screen with graphical user interface or portion thereof |
D934281, | Apr 22 2014 | GOOGLE LLC | Display screen with graphical user interface or portion thereof |
ER2770, | |||
ER2897, | |||
ER399, | |||
ER6403, | |||
ER6495, | |||
ER96, |
Patent | Priority | Assignee | Title |
4672444, | Nov 14 1985 | RCA Corporation | Method for transmitting a high-resolution image over a narrow-band communication channel |
4847788, | Mar 01 1985 | Hitachi, Ltd. | Graphic data processing method and system |
4876597, | Sep 04 1987 | CCTC INTERNATIONAL, INC | Video observation systems |
5602564, | Nov 14 1991 | Hitachi, Ltd. | Graphic data processing system |
5949551, | Apr 25 1997 | Eastman Kodak Company | Image handling method using different image resolutions |
5953506, | Dec 17 1996 | Oracle International Corporation | Method and apparatus that provides a scalable media delivery system |
6490525, | Jun 04 1996 | BARON SERVICES, INC | Systems and methods for distributing real-time site-specific weather information |
6493633, | Jun 04 1996 | BARON SERVICES, INC | Systems and methods for distributing real-time site specific weather information |
6525732, | Feb 17 2000 | Wisconsin Alumni Research Foundation | Network-based viewing of images of three-dimensional objects |
6937210, | Nov 06 2002 | U S DEPARTMENT OF COMMERCE | Projecting images on a sphere |
DE3639026, | |||
DE4209936, | |||
EP587443, | |||
EP684585, | |||
EP780800, |
Executed on | Assignor | Assignee | Conveyance | Frame | Reel | Doc |
Dec 31 2007 | Art + Com AG | (assignment on the face of the patent) | / | |||
Oct 27 2010 | ART+COM AG | INNOVATIONPOOL GMBH | ASSIGNMENT OF ASSIGNORS INTEREST SEE DOCUMENT FOR DETAILS | 029853 | /0243 |
Nov 10 2011 | INNOVATIONPOOL GMBH | ART+COM INNOVATIONPOOL GMBH | CHANGE OF NAME SEE DOCUMENT FOR DETAILS | 029854 | /0283 |
Date | Maintenance Fee Events |
Dec 14 2011 | M2553: Payment of Maintenance Fee, 12th Yr, Small Entity. |
Date | Maintenance Schedule |
Jul 13 2013 | 4 years fee payment window open |
Jan 13 2014 | 6 months grace period start (w surcharge) |
Jul 13 2014 | patent expiry (for year 4) |
Jul 13 2016 | 2 years to revive unintentionally abandoned end. (for year 4) |
Jul 13 2017 | 8 years fee payment window open |
Jan 13 2018 | 6 months grace period start (w surcharge) |
Jul 13 2018 | patent expiry (for year 8) |
Jul 13 2020 | 2 years to revive unintentionally abandoned end. (for year 8) |
Jul 13 2021 | 12 years fee payment window open |
Jan 13 2022 | 6 months grace period start (w surcharge) |
Jul 13 2022 | patent expiry (for year 12) |
Jul 13 2024 | 2 years to revive unintentionally abandoned end. (for year 12) |