A computer-implemented method for creating an image that depicts shadowing for a specified light source, even though the input data are not three-dimensional and are limited to elevation data that associates an elevation value with each of a plurality of spatial coordinates. Plumb line walls are generated between elevation points of neighboring grid cells for each elevation point meeting a specified delta elevation criterion. A shadow map is accumulated based on visibility of each pixel to the light source position, and then, in a subsequent pass through the coordinate pixels of the data, an image is created in a tangible medium with each pixel correspondingly visible or shadowed, either totally or partially. Values along one dimension may be spread over a Z-buffer range to optimally resolve visibility features.
1. A computer-implemented method for creating an image in a tangible medium of a physical scene, the image including a plurality of pixels, the method comprising:
a. receiving elevation point data in which an elevation value is associated with each of a plurality of spatial coordinates corresponding to elevation points, each elevation point centered within a distinct grid cell;
b. determining a light source position specified in 3D space;
c. transforming the elevation point data to a Cartesian coordinate system;
d. executing computer program instructions to generate plumb line walls between elevation points of neighboring distinct grid cells for each elevation point meeting a specified delta elevation criterion;
e. building a modelview matrix, in a memory device within a computer, based on the specified light source position;
f. accumulating a shadow map based on visibility of each pixel to the light source position;
g. employing a projection matrix for transforming the shadow map to a viewer frame of reference;
h. in a subsequent pass through pixels within a specified view volume, processing each pixel as a visible pixel or an occluded pixel on a basis of the shadow map; and
i. creating the image in the tangible medium with each pixel correspondingly visible or shadowed.
12. A computer program product for use on a computer system for creating an image in a tangible medium of a physical scene, the image including a plurality of pixels, the computer program product comprising a non-transitory computer-readable medium on which are stored computer instructions such that, when executed by a processor, the instructions cause the processor to:
a. receive elevation point data in which an elevation value is associated with each of a plurality of spatial coordinates corresponding to elevation points, each elevation point centered within a distinct grid cell;
b. determine a light source position specified in 3D space;
c. transform the elevation point data to a Cartesian coordinate system;
d. generate plumb line walls from elevation points of distinct grid cells to a z=0 plane for each elevation point meeting a specified delta elevation criterion;
e. build a modelview matrix based on the specified light source position;
f. accumulate a shadow map based on visibility of each pixel to the light source position;
g. transform the shadow map to a viewer frame of reference;
h. in a subsequent pass through pixels within a specified view volume, process each pixel as a visible pixel or an occluded pixel on a basis of the shadow map; and
i. create the image in the tangible medium with each pixel correspondingly visible or shadowed.
2. A computer-implemented method according to
3. A computer-implemented method according to
4. A computer-implemented method according to
5. A computer-implemented method according to
6. A computer-implemented method according to
7. A computer-implemented method according to
8. A computer-implemented method according to
9. A computer-implemented method according to
10. A computer-implemented method according to
11. A computer-implemented method according to
13. A computer program product according to
14. A computer program product according to
The present invention relates to the generation of shadows in images, and, more particularly, to methods and apparatus for performing shadowing on the basis of very large 2.5-dimensional elevation data sets.
Shadow mapping is a technique, generally traced back to the paper of Williams, “Casting curved shadows on curved surfaces,” SIGGRAPH '78 Proc. 5th Annual Conf. on Computer Graphics, pp. 270-74 (1978) (incorporated herein by reference), used to determine where shadows lie in 3D computer graphics on the basis of knowledge of the source and direction of light illuminating a 3D scene. In traditional shadow mapping, each pixel is tested as to whether it is visible from the light source and thus illuminated by it, and, if not, the pixel is designated as to be shadowed.
As described, shadow mapping is based upon 3D data. However, in the context of geographic information systems (GIS), 3D data may not be available. Indeed, it would be highly desirable to provide the capability to estimate shadows cast by any illumination source, either indoors or outdoors, where the available data might be surface elevation, LIDAR, point cloud, or partial surface data, and where full side wall data may not be available. In particular, data relevant to side walls of man-made structures, or to naturally occurring slopes in hilly or mountainous terrain, are largely absent from elevation data.
For purposes of illustration,
Moreover, in addition to the absence of full 3D geometric data, another daunting feature of GIS data sets is the very large quantity of data that must be processed, often in a near real-time mode.
In accordance with an embodiment of the present invention, a computer-implemented method is provided for creating an image in a tangible medium of a physical scene. The computer-implemented method has steps of:
receiving elevation point data in which an elevation value is associated with each of a plurality of spatial coordinates;
determining a light source position specified in 3D space;
transforming the elevation point data to a Cartesian coordinate system;
generating plumb line walls between elevation points of neighboring grid cells for each elevation point meeting a specified delta elevation criterion;
building a modelview matrix based on the specified light source position;
accumulating a shadow map based on visibility of each pixel to the light source position;
employing a projection matrix for transforming the shadow map to a viewer frame of reference;
and, in a subsequent pass through pixels within a specified view volume,
processing each pixel as a visible pixel or an occluded pixel on a basis of the shadow map; and
creating the image in the tangible medium with each pixel correspondingly visible or shadowed.
In accordance with alternate embodiments of the present invention, the elevation point data may be formatted in a grid of image tiles. The elevation point data may be provided in geographic coordinates.
In other embodiments of the invention, the method may have a further step of calculating solar insolation on the basis of the shadow map. The shadow map, like the input data, may be formatted in a grid of image tiles.
In further embodiments, determination of the light position may be based on a specified geographic location and a specified time. Accumulating the shadow map may include testing points in a Z buffer, as well as calculation in a raster coordinate system, and may be based in part on material properties of illuminated surfaces. Accumulating the shadow map may include tracing rays of illumination, and may additionally account for ambient light.
In accordance with another aspect of the present invention, a computer program product is provided for use on a computer system for creating an image in a tangible medium of a physical scene, wherein the image includes multiple pixels. The computer program product includes a non-transitory computer-readable medium on which are stored computer instructions such that, when executed by a processor, the instructions cause the processor to:
In other embodiments of the invention, the computer program product may have instructions that cause the processor to format the shadow map in a grid of image tiles. The instructions may also cause the processor to optimize a Z depth spread of a raster coordinate system.
The invention will be more fully understood by referring to the following Detailed Description of Specific Embodiments in conjunction with the Drawings, of which:
The term “image” shall refer to any multidimensional representation, whether in tangible or otherwise perceptible form, or otherwise, wherein a value of some characteristic (amplitude, phase, etc.) is associated with each of a plurality of locations (or, vectors in a Euclidean space, typically 2) corresponding to dimensional coordinates of an object in physical space, though not necessarily mapped one-to-one thereonto. Thus, for example, the graphic display of the spatial distribution of some field, either scalar or vectorial, such as brightness or color, or intensity of a generated second harmonic, constitutes an image. So, also, does an array of numbers, such as a 3D holographic dataset, in a computer memory or holographic medium. Similarly, “imaging” refers to the rendering of a stated physical characteristic in terms of one or more images.
The term “shadow map,” used here synonymously with the term “occlusion map,” refers to an image in which pixels are tagged as visible, or not, as viewed from a locus in space denoted as the “viewer's position.”
The term “point of illumination” may refer either to a point, in a rigorous geometrical sense, or to a “patch,” in that it refers to an area, defined in three-dimensional space, that is compact, in a rigorous mathematical sense, from which light is assumed to emanate—typically, but not necessarily, in parallel rays—such as to illuminate a scene.
A “distant light source direction” refers to a direction from which light emanating from a source of illumination may be assumed to propagate with a planar phase front, i.e., with all rays traveling in parallel toward the illuminated scene.
A “modelview matrix” shall refer to a transformation from object coordinates to coordinates in a frame of reference based on a position and direction specified in 3D space, such as a point of illumination or a point of view.
A “projection matrix” shall refer to a transformation from object coordinates to coordinates in a subspace, which is to say, a space of lower dimensionality, such as a two-dimensional image of a three-dimensional scene.
The term “time” is used herein in the most general sense possible, and includes date and astronomical epoch, or whatever other parameters are used to indicate time in a desired frame of reference.
The term “elevation data,” or, equivalently, “2.5D data,” refers to data that provide elevations corresponding to specific points, or coordinates, on a surface, such as the surface of the Earth, for example. Elevation data may be collected in any number of manners, including remote sensing methods such as LIDAR, or other aerial survey methods, or by terrestrial surveys, again, by way of example only.
In accordance with embodiments of the present invention, two sorts of inputs are used to create a shadow map: elevation data sets, in any of a large variety of possible formats, and immediate-user-defined inputs. The immediate-user-defined inputs may include such parameters as the date and time (allowing for calculation of the position of the Sun), criteria defining an elevation-change delta (as discussed below) for generating a plumb line, and the spacing density of inferred points on generated plumb lines. These immediate-user-defined inputs are provided by way of example only, and without limitation.
Many methods for creating shadow maps may be practiced in accordance with the prior art. The present invention pertains to various novel methods that are now described and claimed herein. Referring, first, to
As an overview, a method of shadow estimation, designated generally by numeral 100, is now described with reference to
The elevation data that are input in step 102 may be supplied in any format. For example, the elevation data may be formatted into a regular grid of image tiles, which may be provided in one or more separate data files. While elevation data may be supplied in any of a variety of coordinate systems, the data are then transformed to Cartesian coordinates (104) using standard coordinate transformation techniques. Within the scope of the present invention, transformation to Cartesian coordinates may be performed in one or more steps. For example, the elevation data may first be converted to latitude/longitude format, and then to Cartesian coordinates.
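By way of illustration only (and not as part of the claimed subject matter), the transformation of step 104 may be sketched as follows. This sketch assumes a simple spherical-Earth model; the function name and mean-radius constant are illustrative, and production GIS code would instead use a geodetic datum such as WGS84:

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius; a spherical simplification, not WGS84

def geographic_to_cartesian(lat_deg, lon_deg, elevation_m):
    """Convert latitude/longitude (degrees) plus elevation (meters) to
    Earth-centered Cartesian (x, y, z) under a spherical-Earth assumption."""
    lat = math.radians(lat_deg)
    lon = math.radians(lon_deg)
    r = EARTH_RADIUS_M + elevation_m
    x = r * math.cos(lat) * math.cos(lon)
    y = r * math.cos(lat) * math.sin(lon)
    z = r * math.sin(lat)
    return (x, y, z)
```

As noted above, the transformation may equally be performed in stages, for example converting a projected coordinate system to latitude/longitude first and then applying a conversion such as this one.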
Input data may also include the time and date for which shadow casting data are desired. Within the scope of the present invention, the time may be provided in whatever time frame is desired. For example, a terrestrial time zone may be specified, or else a time zone may be extracted based on the geographic location of the input data. If the data pertain to another planet, for example, the time may be designated using any convention for specification of time.
Typically, the data will be handled as gridded into a raster array, with a point of elevation data centered within each grid cell. It is to be understood, however, that implementation of embodiments of the present invention does not rest on any particular gridding of the data. Once the data points have been recast in raster space, the values of vectors in raster representation (RasterX, RasterY, RasterZ) will typically assume values within the following ranges: RasterX: [0, Full Virtual Raster Width]; RasterY: [0, Full Virtual Raster Height]; and RasterZ: [0,1]. Optimal spreading of Z values over the Z raster range is an optional aspect of the present invention, and will be discussed in detail below.
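The recasting of model-space points into the raster ranges just described may be sketched as follows; this is an illustrative normalization only, with assumed function and parameter names, not the claimed implementation:

```python
def to_raster(points, width, height):
    """Map model-space (x, y, z) points into raster space so that
    RasterX falls in [0, width], RasterY in [0, height], RasterZ in [0, 1]."""
    xs, ys, zs = zip(*points)

    def bounds(vals):
        lo, hi = min(vals), max(vals)
        return lo, (hi - lo) or 1.0   # guard against a flat (zero-span) axis

    x0, dx = bounds(xs)
    y0, dy = bounds(ys)
    z0, dz = bounds(zs)
    return [((x - x0) / dx * width,
             (y - y0) / dy * height,
             (z - z0) / dz) for x, y, z in points]
```

Note that this simple min/max normalization does not by itself spread Z values optimally; that refinement is the subject of the iterative search discussed below.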
The light source position is then determined (106) relative to the scene to be shadow-mapped. Any method for specifying the light source position is within the scope of the present invention. One example is that of determining the position of the Sun relative to a terrestrial scene in order to compute insolation based on a time specified by the user. Determining the position of the Sun (or any other astronomical source) relative to a specified position on Earth at a specified time is an astronomical procedure well-known to persons of ordinary skill in astronomy and needs no further description here. In various shadow mapping applications, however, the Sun may be treated as a distant light source, which is to say that the illumination phase front is flat, i.e., that light rays 28 (shown in
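Treating the Sun as a distant light source amounts to reducing its position to a single direction vector. The following illustrative sketch derives that vector from a solar azimuth and altitude (which would themselves come from a standard solar ephemeris computation, omitted here); the east/north/up axis convention and function name are assumptions of this example:

```python
import math

def distant_light_direction(azimuth_deg, altitude_deg):
    """Unit vector pointing from the scene toward a distant light source,
    given azimuth (degrees clockwise from north) and altitude above the
    horizon. Axis convention assumed: x = east, y = north, z = up."""
    az = math.radians(azimuth_deg)
    alt = math.radians(altitude_deg)
    return (math.cos(alt) * math.sin(az),
            math.cos(alt) * math.cos(az),
            math.sin(alt))
```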
In accordance with embodiments of the present invention, a shadow map is generated during two passes through the elevation points in the input data. During the first pass through the data, plumb lines are generated from upper to adjacent lower elevations (108) on the basis of criteria that may be determined by the immediate user. Typically, the immediate user will specify an elevation change delta, defined such that if elevation points in adjacent cells 12 and 13 (shown in
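The plumb-line generation of step 108 may be sketched as follows for a regular elevation grid. The function name and the choice to anchor inferred wall points under the higher of the two cells are illustrative assumptions; only the delta-elevation test and the user-specified vertical spacing come from the description above:

```python
def generate_plumb_points(grid, delta, spacing):
    """For each pair of 4-neighboring grid cells whose elevation difference
    exceeds `delta`, emit inferred points down the vertical (plumb-line)
    wall between the two elevations, at vertical step `spacing`.
    Returns a list of (row, col, z) inferred wall points."""
    walls = []
    rows, cols = len(grid), len(grid[0])
    for r in range(rows):
        for c in range(cols):
            for dr, dc in ((0, 1), (1, 0)):   # visit each neighbor pair once
                r2, c2 = r + dr, c + dc
                if r2 >= rows or c2 >= cols:
                    continue
                a, b = grid[r][c], grid[r2][c2]
                if max(a, b) - min(a, b) <= delta:
                    continue                   # delta criterion not met: no wall
                anchor = (r, c) if a >= b else (r2, c2)  # wall under the higher cell
                z = max(a, b) - spacing
                while z > min(a, b):
                    walls.append((anchor[0], anchor[1], z))
                    z -= spacing
    return walls
```

The inferred points produced this way serve as stand-ins for the missing side-wall geometry during the subsequent visibility tests.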
At this juncture, the Z buffer depth range may be spread (110), as further described in detail below. Adjusting the effective position of light source 20 allows the visibility test, performed in conjunction with accumulation of a shadow map, to be performed more accurately, by computing as large a spread between non-adjacent points as is viable. In essence, the computed rasterZ values are spread over the entire closed [0,1] rasterZ range.
One or more data transformation matrices, including a modelview transformation, are calculated (112) based on the light source position, so that shadowcasting tests may be performed, in accordance with any of the 3D shadow casting algorithms that are known in the art. A shadow visibility map (otherwise referred to herein as an “occlusion map” or “shadow occlusion map”) is then accumulated using standard techniques, or otherwise, for determining whether a given pixel is exposed to source 20, or whether an inferred wall 30 intervenes such that a given pixel 35 is obscured, as shown in
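The accumulation of the occlusion map in the light's frame of reference may be sketched as follows; this illustrates the standard Z-buffer idea (smallest Z per raster cell wins), with assumed function names and a simple dictionary in place of a hardware depth buffer:

```python
def accumulate_shadow_map(light_space_points):
    """First pass: for every point transformed into the light's raster
    frame, keep the smallest Z per (x, y) cell, i.e., the surface nearest
    the light. Returns {(x, y): nearest Z}."""
    depth = {}
    for x, y, z in light_space_points:
        key = (round(x), round(y))
        if key not in depth or z < depth[key]:
            depth[key] = z
    return depth

def is_occluded(depth, x, y, z, tol=1e-3):
    """A point is in shadow if something nearer the light occupies its cell;
    the tolerance admits near-coincident surfaces as mutually visible."""
    nearest = depth.get((round(x), round(y)))
    return nearest is not None and z > nearest + tol
```

The tolerance in the visibility test is what makes the Z-value spread discussed below important: when Z values are clumped, no single tolerance separates visible from occluded points reliably.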
Once all relevant pixels have been tagged as visible or occluded, a second pass through the data is performed. A projection transformation is applied (116) accounting for the viewing point with respect to the scene that is being mapped. Then, on the basis of the pixel markings as to occlusion, with the pixels suitably transformed to the frame of the viewer, a shadow image is created (118), coloring points based on their visibility for line-of-sight or viewshed analysis, for example, using any of the shadow image creation techniques known in the art. More than a single source 15 of illumination may be considered, within the scope of the present invention, as may ambient lighting. Alternatively, the occlusion data may be applied, within the scope of the present invention, for other computational purposes, such as the computation of solar insolation, for example.
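The second pass may be sketched as follows, coloring each pixel from the accumulated map; the grayscale values, function name, and flat-list image representation are illustrative assumptions, and a production renderer would additionally blend material properties and ambient light as noted above:

```python
def shade_pixels(pixels, shadow_depth, tol=1e-3, lit=255, shadowed=64):
    """Second pass: given each pixel's (x, y, z) in the light's raster
    frame, color it lit or shadowed using the accumulated shadow map
    `shadow_depth` ({(x, y): nearest-to-light Z}). Returns one grayscale
    value per input pixel."""
    image = []
    for x, y, z in pixels:
        nearest = shadow_depth.get((round(x), round(y)), z)
        image.append(lit if z <= nearest + tol else shadowed)
    return image
```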
Further detail is now provided with respect to various embodiments of the method 100 of shadow estimation depicted schematically in the flowchart of
As a preliminary step, it is preferred to compute a list of model volume boundary points based on the range of data in the initial model, as now described with reference to
Once plumb lines 22 have been dropped and walls 30 inferred so as to create effective 3D data for shadowing, as herein taught for the first time, standard 3D graphics transformations may then be applied and a shadow map may be accumulated. The shadow or occlusion map is based on the 3D graphics concept of a Z buffer or depth map. In this transformation multiple points may transform to the same raster coordinate. At the end of a pass through the data, only one of the potentially multiple pixels will be visible. The visible pixel is the one that transforms to a Z depth or value closest to the viewpoint. The Z depth closest to the viewpoint will be the smallest Z value computed for the corresponding x, y raster coordinates.
A poor utilization of the Z range leads to poor shadow visibility test results, as it may be very difficult to tell which raster point is visible or in shadow and which point is not. With a proper spread of the raster Z values, two Z raster values lying within a small distance of each other may be considered both visible. But, when the majority of the Z values are clumped within a very small range, it is difficult to determine which points are visible and which are not even when a tolerance is used.
In accordance with the present invention, Z values are spread in an iterative binary search for the best limits and best scale factor that may be obtained in a specified number of iterations. Any search algorithm may be employed to optimize the Z spread, within the scope of the present invention. A preferred method for optimizing the Z spread is now described, with reference to
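The iterative binary search may be sketched generically as follows. The helper `depth_of_scale` stands in for the view-volume computation and containment test described below (it must return the maximum raster Z produced by a candidate scale factor, and be monotonically increasing in that factor); the function names and interface are illustrative assumptions, not the claimed implementation:

```python
def spread_z_scale(depth_of_scale, lo, hi, iterations=32, target=1.0):
    """Binary-search for the largest scale factor whose resulting maximum
    raster Z still fits within the [0, 1] range, i.e., the scale that
    spreads Z values over as much of the range as possible."""
    best = lo
    for _ in range(iterations):
        mid = (lo + hi) / 2.0
        if depth_of_scale(mid) <= target:
            best = mid      # fits within range; try a larger spread
            lo = mid
        else:
            hi = mid        # overflows the raster range; back off
    return best
```

With 32 iterations the bracketing interval shrinks by a factor of 2^32, which is ample for single-precision raster depths.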
In a SetupModelParameters module, the raster data transformation matrices are initially set up. The light source position is determined (601) using the user-specified date and time, for example, and the central model position of the input data. This computation establishes the distant light source direction used to create the raster transformation matrices for the algorithm. For geographic data, for example, the central model position is calculated as the center surface point of the input 2.5D data.
In a ComputeZDistanceScaleFactor module (603), a Z distance scale factor and a resulting eye-to-target distance are computed. This allows for computation of the raster transformation matrices used in the visibility algorithm. In particular, the Modelview Matrix and Projection Matrix are built, and the view volume is calculated (605) and tested for whether it is contained within, and optimally fills, the [0,1] range. The process may then be iterated to specified limits.
The above described methods may be performed, and the above described systems may be implemented, by a computer system, including a processor, by executing appropriate instructions stored in a memory. Apparatus for creating a shadow map of structures and terrain have been described as including a processor controlled by instructions stored in a memory. The memory may be random access memory (RAM), read-only memory (ROM), flash memory or any other memory, or combination thereof, suitable for storing control software or other instructions and data. Some of the functions performed by the transformation methods and apparatus have been described with reference to flowcharts and/or block diagrams. Those skilled in the art should readily appreciate that functions, operations, decisions, etc. of all or a portion of each block, or a combination of blocks, of the flowcharts or block diagrams may be implemented as computer program instructions, software, hardware, firmware or combinations thereof. Those skilled in the art should also readily appreciate that instructions or programs defining the functions of the present invention may be delivered to a processor in many forms, including, but not limited to, information permanently stored on tangible non-transitory non-writable storage media (e.g. read-only memory devices within a computer, such as ROM, or devices readable by a computer I/O attachment, such as CD-ROM or DVD disks), information alterably stored on tangible non-transitory writable storage media (e.g. floppy disks, removable flash memory and hard drives) or information conveyed to a computer through communication media, including wired or wireless computer networks. 
In addition, while the invention may be embodied in software, the functions necessary to implement the invention may optionally or alternatively be embodied in part or in whole using firmware and/or hardware components, such as combinatorial logic, Application Specific Integrated Circuits (ASICs), Field-Programmable Gate Arrays (FPGAs) or other hardware or some combination of hardware, software and/or firmware components.
While the invention is described through the above-described exemplary embodiments, it will be understood by those of ordinary skill in the art that modifications to, and variations of, the illustrated embodiments may be made without departing from the inventive concepts disclosed herein. For example, although some aspects of the shadow estimation method have been described with reference to a flowchart, those skilled in the art should readily appreciate that functions, operations, decisions, etc. of all or a portion of each block, or a combination of blocks, of the flowchart may be combined, separated into separate operations or performed in other orders. Moreover, while the embodiments are described in connection with various illustrative data structures, one skilled in the art will recognize that the system may be embodied using a variety of data structures. Furthermore, disclosed aspects, or portions of these aspects, may be combined in ways not listed above. Accordingly, the invention should not be viewed as being limited to the disclosed embodiments.
Executed on | Assignor | Assignee | Conveyance | Frame | Reel | Doc |
Nov 11 2014 | Intergraph Corporation | (assignment on the face of the patent) | / | |||
Nov 19 2014 | ACREE, ELAINE S | Intergraph Corporation | ASSIGNMENT OF ASSIGNORS INTEREST SEE DOCUMENT FOR DETAILS | 034215 | /0167 |
Date | Maintenance Fee Events |
Aug 17 2020 | REM: Maintenance Fee Reminder Mailed. |
Feb 01 2021 | EXP: Patent Expired for Failure to Pay Maintenance Fees. |