Disclosed are embodiments of in-situ display monitoring and calibration systems and methods. An image acquisition system captures images of the viewing plane of the display. Captured images may then be processed to characterize various visual performance characteristics of the display. When not in use for capturing images of the display, the image acquisition system can be stored in a manner that protects it from environmental hazards such as dust, dirt, precipitation, direct sunlight, etc. A calibration image in which a plurality of light emitting elements is set to a particular color and intensity may be displayed, an image then captured, and then a difference between what was expected and what was captured may be developed for each light emitting element. Differences between captured images and expected images may be used to create a calibration data set, which may then be used to adjust the display of further images upon the display.
8. A display calibration method comprising:
displaying, by a display control subsystem, a calibration pattern on a viewing plane, the viewing plane comprising a plurality of light emitting elements arranged in a predetermined pattern, the display control subsystem comprising, for each of the plurality of light emitting elements, a first data set defining a surface normal vector and a second data set defining an incidence vector;
capturing a first image of the viewing plane, the first image having a first position and a first orientation;
capturing a second image of the viewing plane, the second image having a second position and a second orientation;
normalizing the first image and the second image using a plurality of data sets indicative of an expected image;
determining one or more differences in characteristics between: (i) the normalized first image and the expected image, or (ii) the normalized second image and the expected image; and
applying adjustments to the viewing plane to correct the one or more differences.
1. A display system comprising:
a plurality of light emitting elements arranged in a predetermined pattern, the plurality of light emitting elements forming a viewing plane;
a display control subsystem configured to display a calibration pattern on the viewing plane and comprising, for each of the plurality of light emitting elements, a first data set defining a surface normal vector and a second data set defining an incidence vector, the display control subsystem storing instructions that, when executed, cause the display system to:
capture a first image of the viewing plane, the first image having a first position and a first orientation;
capture a second image of the viewing plane, the second image having a second position and a second orientation;
normalize the first image and the second image using a plurality of data sets indicative of an expected image;
determine one or more differences in characteristics between: (i) the normalized first image and the expected image, or (ii) the normalized second image and the expected image; and
apply adjustments to the viewing plane to correct the one or more differences.
15. A display calibration system, comprising:
a plurality of light emitting elements arranged in a predetermined pattern, the plurality of light emitting elements forming a viewing plane;
a first image acquisition subsystem having a plurality of light receiving elements forming a first imaging plane, the first image acquisition subsystem configured to capture a first image of the viewing plane from a first position and a first orientation;
a second image acquisition subsystem having a plurality of light receiving elements forming a second imaging plane, the second image acquisition subsystem configured to capture a second image of the viewing plane from a second position and a second orientation;
a linkage coupled to the first image acquisition subsystem, the linkage having (i) a deployed position corresponding to the first position and the first orientation, and (ii) a stored position to protect the first imaging plane; and
a display control subsystem configured to display a calibration pattern on the viewing plane, wherein the display control subsystem stores instructions that, when executed, cause the display calibration system to:
capture the first image of the viewing plane;
capture the second image of the viewing plane;
normalize the first image and the second image using a plurality of data sets indicative of an expected image;
determine one or more differences in characteristics between: (i) the normalized first image and the expected image, or (ii) the normalized second image and the expected image; and
apply adjustments to the viewing plane to correct the one or more differences.
2. The display system of
3. The display system of
4. The display system of
5. The display system of
the surface normal vector originates at each light emitting element and is directed perpendicular to the viewing plane;
each incidence vector originates at each light emitting element and is directed at the first position or the second position; and
the display control subsystem further comprises a third data set defining, for each of the plurality of light emitting elements, a first luminous output corresponding to the surface normal vector, a second luminous output corresponding to the incidence vector, and a known relationship to link the first luminous output with the second luminous output.
6. The display system of
7. The display system of
9. The display calibration method of
10. The display calibration method of
transitioning the first image acquisition subsystem from a stored position to a deployed position corresponding to the first position and the first orientation.
11. The display calibration method of
12. The display calibration method of
the surface normal vector originates at each light emitting element and is directed perpendicular to the viewing plane;
each incidence vector originates at each light emitting element and is directed at the first position or the second position; and
the display control subsystem further comprises a third data set defining, for each of the plurality of light emitting elements, a first luminous output corresponding to the surface normal vector, a second luminous output corresponding to the incidence vector, and a known relationship to link the first luminous output with the second luminous output.
13. The display calibration method of
14. The display calibration method of
16. The display calibration system of
a first data set defining a surface normal vector for each of the plurality of light emitting elements, the surface normal vector originating at each light emitting element and directed perpendicular to the viewing plane;
a second data set defining an incidence vector for each of the plurality of light emitting elements, each incidence vector originating at each light emitting element and directed at the first position or the second position; and
a third data set defining, for each of the plurality of light emitting elements, a first luminous output corresponding to the surface normal vector, a second luminous output corresponding to the incidence vector, and a known relationship to link the first luminous output with the second luminous output.
17. The display calibration system of
18. The display calibration system of
prior to capturing the first image, transition, by the linkage, the first image acquisition subsystem from the stored position to the deployed position.
19. The display calibration system of
This non-provisional utility application claims the benefit of and is a continuation of application Ser. No. 16/983,283, filed Aug. 3, 2020 and entitled “In-Situ Display Monitoring and Calibration System and Methods”. Application Ser. No. 16/983,283 claimed the benefit of and is a continuation of application Ser. No. 16/245,792, filed Jan. 11, 2019 and entitled “In-Situ Display Monitoring and Calibration System and Methods”. Application Ser. No. 16/245,792 claimed the benefit of and is a continuation of application Ser. No. 15/459,089, filed Mar. 15, 2017 and entitled “In-Situ Display Monitoring and Calibration System and Methods”. Application Ser. No. 15/459,089 claimed the benefit of prior filed provisional application No. 62/309,739, filed Mar. 17, 2016 and entitled “In-Situ Display Monitoring and Calibration System and Methods”. Application Ser. Nos. 16/983,283, 16/245,792, 15/459,089, and 62/309,739 are herein incorporated by reference.
The sense of sight is utterly compelling to those human beings who possess it. The adage that a picture is worth a thousand words resonates with an appreciation of the profound importance of taking in visual information. The sense of sight is unique in allowing us to absorb so much information from our world so quickly. It is natural then that advertisers, entertainers, artists, and others all want to engage people with their own visual content for the purpose of creating a desired response in their intended audience. A large-scale visual display system is a particularly compelling way for people to experience the presentation of visual information, and such systems are the focus of the present disclosure.
There are numerous features of a visual display system that contribute to its impact upon viewers including: size, brightness, contrast, color saturation, color depth, display refresh rate, resolution, pixel pitch, pixel pitch uniformity, and others.
There are numerous other features of a visual display system that are of interest to the owners and operators of such systems including: ease of installation, ease of service, reliability, ease of configuration, ease of maintenance, ease of operation, cost of the system, cost of installation, cost of operation, cost of service, and others.
Display systems with large screen sizes present a number of difficult problems that are in need of solution. One significant challenge for display owners and operators is to maintain the visual performance of a large display once it has been installed in a viewing location. The visual performance of a display can be characterized using a number of measures including uniformity of brightness across the entire display, uniformity of color across the entire display, contrast ratio, color temperature and uniformity of color temperature across the entire display, color fidelity to predetermined standards, etc.
Large displays may be built from a plurality of individual light emitting elements, arranged in a pre-determined pattern to create a composite viewing plane. Due to variances in manufactured materials and manufacturing processes, the characteristics of individual light emitting devices vary from one device to the next, and individual devices may respond differently to environmental conditions of voltage, current, temperature, humidity, exposure to sunlight, exposure to atmospheric gases such as ozone and nitrogen oxides, and aging. Variations in performance of individual light emitting elements include luminous intensity per light emitting element, luminous intensity produced per unit current, dominant wavelength of emitted light, wavelength distribution of emitted light, and the temperature coefficient of change of any of the prior parameters. Visual performance of the entire display is therefore subject to change as each and every light emitting element is exposed to the previously listed environmental factors and their variations over time.
In consideration of the foregoing points, it is clear that embodiments of the present disclosure confer numerous advantages and are therefore highly desirable.
The present disclosure is directed to systems and methods for monitoring and calibrating display screens which comprise a plurality of display modules, each module having a plurality of light emitting elements, the plurality of display modules disposed to collectively provide a viewing plane. Other aspects of the disclosure are directed to systems and methods for monitoring and calibrating display screens which comprise a plurality of display modules, each module having a plurality of modulated reflective elements disposed to collectively provide a viewing plane.
Display systems of the present disclosure comprise a plurality of light emitting elements coupled to a substrate and arranged in a predetermined pattern collectively forming a viewing plane. Other display systems of the present disclosure may comprise a plurality of display modules assembled to make a large, unified, visual display in which each display module comprises a plurality of light emitting elements coupled to a substrate and arranged in a predetermined pattern with respect to a viewing plane. Each display module may be shaped so that it may abut one or more other display modules without introducing gaps or overlaps between adjacent display modules. The display systems disclosed create a highly uniform visual effect by creating highly uniform spacing between light emitting elements, both within a single display module and across a plurality of display modules when the plurality are assembled into a large, unified, visual display.
The present disclosure provides systems and methods of monitoring and calibration of displays that may be in indoor or outdoor locations. Basic system features include: an electro-optical image acquisition system; a moveable linkage coupled to the image acquisition system, the linkage having at least two defined positions: a deployed position in which the image acquisition system is disposed and directed so that the viewing plane of the display is imageable by the image acquisition system; and a stored position in which the image acquisition system is protected from the environment. The image acquisition system has spatial resolution and sensitivity to both color and luminous flux sufficient to enable monitoring and calibration operations using data acquired by the image acquisition system.
Each time the moveable linkage is moved into the deployed position, the image acquisition system is disposed in the same position and orientation with respect to the position and orientation of the viewing plane of the display. Each time the moveable linkage is moved into the stored position, the image acquisition system is disposed to protect it from the environment, direct sun exposure, precipitation, etc.
Raw data acquired by the image acquisition system may be processed to provide normalized image data from the display. In turn, normalized image data may be processed to provide calibration data for use in calibrating the display's visual performance according to desired characteristics. In turn, calibration data may be processed by the display to affect, adjust, or perfect the visual performance of the display.
Acquisition of raw data by the image acquisition system may occur autonomously or under the control of a remote agent. Autonomous acquisition may occur according to a pre-established calendar or schedule. Autonomous acquisition may furthermore be condition responsive with respect to ambient lighting, temperature, time-of-day, or weather conditions. For example, it may be advantageous to delay acquisition of raw data if it is raining or snowing outside. Condition responsive acquisition may delay or otherwise schedule acquisition until local conditions are more suitable.
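By way of illustration only, autonomous, condition-responsive acquisition might be structured as follows. The helper callables (read_sensors, capture), the sensor keys, and the illuminance threshold are hypothetical placeholders, not part of the disclosure.

```python
import time
from datetime import datetime

def weather_ok(sensors: dict) -> bool:
    # Defer acquisition during rain or snow, per the example above.
    return not sensors.get("precipitation", False)

def ambient_light_ok(sensors: dict) -> bool:
    # Defer acquisition when ambient light would corrupt the capture;
    # the 10 lux threshold is purely illustrative.
    return sensors.get("ambient_lux", 0.0) < 10.0

def acquire_when_suitable(read_sensors, capture, scheduled: datetime,
                          poll_seconds: int = 60):
    """Wait for the scheduled time, then delay capture until local
    conditions are suitable."""
    while datetime.now() < scheduled:
        time.sleep(poll_seconds)
    sensors = read_sensors()
    while not (weather_ok(sensors) and ambient_light_ok(sensors)):
        time.sleep(poll_seconds)
        sensors = read_sensors()
    return capture()
```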
Raw data may be acquired from the image acquisition system and processed to form calibration data, which can then be acted upon locally to accomplish an in-situ adjustment to the visual performance of the display. In addition, raw data acquired by the image acquisition system may be transmitted from the display to a remote entity, thereby facilitating remote processing of the raw data. Calibration data may be computed remotely and then transmitted to the display system, which can then act locally to accomplish an in-situ adjustment to the visual performance of the display.
When in the deployed state, the image acquisition system is disposed with respect to the viewing plane of the display in a definite position and orientation. This establishes a known geometric relationship between the viewing plane and the image acquisition system. Under certain circumstances the previously mentioned geometric relationship may cause undesirable non-uniformities in the raw data. Many of the feasible geometric relationships between viewing plane and image acquisition system result in raw data that captures more light from some regions of the viewing plane and less light from other regions of the viewing plane.
The known geometric relationship may be combined with the known electro-optical properties of the image acquisition system to form a normalization function that substantially counteracts the undesirable non-uniformities arising from the known geometric relationship. The step of normalizing the acquired raw data may precede the step of forming calibration data. Use of the calibration data may then proceed as before.
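As a minimal sketch of this normalization step, assume the normalization function has been precomputed offline as a per-pixel relative-illuminance map (cf. items 35 and 41 in the reference numeral list below) from the known geometric relationship and the imager's electro-optical properties; the array shapes and the division-based form are illustrative assumptions.

```python
import numpy as np

def normalize_capture(raw: np.ndarray,
                      relative_illuminance: np.ndarray) -> np.ndarray:
    """Counteract geometry-induced non-uniformity in a captured image.

    raw: captured image, H x W (monochrome) or H x W x 3 (color).
    relative_illuminance: H x W map of the fraction of light each pixel
    would receive from a uniformly emitting viewing plane, values in (0, 1].
    """
    gain = 1.0 / relative_illuminance        # the normalization function
    if raw.ndim == 3:
        gain = gain[..., np.newaxis]         # broadcast over color channels
    return raw.astype(np.float64) * gain     # the normalized image
```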
A sequence of one or more calibration patterns may be displayed on the viewing plane. Raw data may be captured by the image acquisition system corresponding to the one or more calibration patterns. The captured data may be used singly or jointly by means of one or more statistical combinations of more than one image. Raw data may then be processed to produce calibration data for the display. The calibration data may pertain to a plurality of individual light emitting elements of the display or may apply to regions containing a plurality of light emitting elements. Calibration data may comprise data corresponding to one or more of the following visual performance characteristics of the display: white point, color gamut, color balance, gamma correction, and brightness.
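One plausible realization of the statistical combination of multiple captures, and of per-element calibration data derived from it, is sketched below; the choice of a median combination and multiplicative gain corrections is illustrative, not required by the disclosure.

```python
import numpy as np

def calibration_from_captures(normalized_stack: np.ndarray,
                              expected: np.ndarray) -> np.ndarray:
    """Form per-element calibration data from several normalized captures.

    normalized_stack: K x H x W x 3 array of K normalized captures of the
                      same calibration pattern.
    expected:         H x W x 3 expected image for that pattern.
    Returns an H x W x 3 array of multiplicative gain corrections.
    """
    combined = np.median(normalized_stack, axis=0)  # joint use of K images
    eps = 1e-6                                      # guard against dark pixels
    return expected / np.maximum(combined, eps)
```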
Exemplary Embodiment 1.0—According to an embodiment of the present disclosure, a monitoring system for use with a display having a plurality of light emitting elements arranged in a predetermined pattern collectively forming a viewing plane comprises:
Exemplary Embodiment 1.1—According to another embodiment of the present disclosure, exemplary embodiment 1.0 is further characterized in that: the image acquisition system is triggerable to capture one or more images, each captured image being an image comprising the entire viewing plane; and the deployed position is further characterized in that said image acquisition system is disposed in a predetermined position to capture one or more images of the entire viewing plane.
Exemplary Embodiment 1.2—According to another embodiment of the present disclosure, exemplary embodiment 1.1 is further characterized in that each light emitting element of said viewing plane is uniquely imageable by at least one of said plurality of light receiving elements forming said imaging plane.
Exemplary Embodiment 1.3—According to another embodiment of the present disclosure, exemplary embodiment 1.1 is further characterized in that the display comprises a plurality of display modules, each display module comprising a plurality of light emitting elements arranged in a predetermined pattern, said plurality of display modules collectively forming the viewing plane; the system further characterized in that each display module of said viewing plane is uniquely imageable by at least one of said plurality of light receiving elements forming said imaging plane.
Exemplary Embodiment 2.0—According to another embodiment of the present disclosure, any of exemplary embodiments 1.0, 1.1, or 1.2 may be further characterized in that: a surface normal vector is defined at each of said plurality of light emitting elements perpendicular to said viewing plane; an incidence vector is defined for each of said plurality of light emitting elements starting at each of said plurality of light emitting elements and directed toward the portion of the imaging plane that images each of said light emitting elements, each incidence vector having both a direction and a distance; each light emitting element of the display producing a first luminous output in the direction of said surface normal and a second luminous output in the direction of the incidence vector, said first and second luminous outputs being in a substantially known relationship; the system additionally comprising a data processing means operable to compute a normalized calibration image by applying said substantially known relationship to one or more images captured by said image acquisition system, the normalized calibration image comprising an estimate of said first luminous output for each of said plurality of light emitting elements.
Exemplary Embodiment 2.1—According to another embodiment of the present disclosure, exemplary embodiments 1.0, 1.1 or 1.2 may be further characterized in that: a viewpoint vector is defined at each of said plurality of light emitting elements, each viewpoint vector being directed in the same direction from each of said plurality of light emitting elements; an incidence vector is defined for each of said plurality of light receiving elements starting at each of said plurality of light emitting elements and directed toward the portion of the imaging plane that images each of said light emitting elements, each incidence vector having both a direction and a distance; each light emitting element of the display producing a first luminous output in the direction of said viewpoint vector and a second luminous output in the direction of the incidence vector, said first and second luminous outputs being in a substantially known relationship; the system additionally comprising a data processing means operable to compute a normalized calibration image by applying said substantially known relationship to one or more images captured by said image acquisition system, the normalized calibration image comprising an estimate of said first luminous output for each of said plurality of light emitting elements.
Exemplary Embodiment 2.2—According to another embodiment of the present disclosure, any of exemplary embodiment 1.3 may be further characterized in that: a surface normal vector is defined at each of said plurality of display modules perpendicular to said viewing plane; an incidence vector is defined for each of said plurality of display modules starting at about the centroid of the display plane of each of said plurality of display modules and directed toward the portion of the imaging plane that images each of said display modules, each incidence vector having both a direction and a distance; each display module of the display producing a first luminous output in the direction of said surface normal and a second luminous output in the direction of the incidence vector, said first and second luminous outputs being in a substantially known relationship; the system additionally comprising a data processing means operable to compute a normalized calibration image by applying said substantially known relationship to one or more images captured by said image acquisition system, the normalized calibration image comprising an estimate of said first luminous output for each of said plurality of display modules.
Exemplary Embodiment 2.3—According to another embodiment of the present disclosure, exemplary embodiment 1.3 may be further characterized in that: a viewpoint vector is defined at each of said plurality of display modules, each viewpoint vector being directed in the same direction from each of said plurality of display modules; an incidence vector is defined for each of said plurality of display modules starting at about the centroid of the display plane of each of said plurality of display modules and directed toward the portion of the imaging plane that images each of said display modules, each incidence vector having both a direction and a distance; each display module of the display producing a first luminous output in the direction of said viewpoint vector and a second luminous output in the direction of the incidence vector, said first and second luminous outputs being in a substantially known relationship; the system additionally comprising a processing means operable to compute a normalized calibration image by applying said substantially known relationship to one or more images captured by said image acquisition system, the normalized calibration image comprising an estimate of said first luminous output for each of said plurality of display modules.
Exemplary Embodiment 2.4—According to another embodiment of the present disclosure, any of exemplary embodiments 2.0, 2.1, 2.2, or 2.3 may be further characterized in that the first and second luminous outputs comprise one or more of the following properties: luminous intensity and wavelength of luminous output.
Exemplary Embodiment 3.0—According to another embodiment of the present disclosure, exemplary embodiments 2.0, 2.1, 2.2, 2.3, or 2.4 further comprising: a display control system operable to render visual data on said display, said display control system responsive to one or more of said normalized calibration images to change the rendering of visual data upon at least a portion of the display.
Exemplary Embodiment 3.1—According to another embodiment of the present disclosure, exemplary embodiment 3.0 in which the change in rendering of visual data upon the display changes one or more of the following visual characteristics of at least a portion of the display: white point, color gamut, color balance, gamma correction, gray-scale rendering, and brightness.
Exemplary Embodiment 3.2—According to another embodiment of the present disclosure, exemplary embodiment 3.0 or 3.1, the display control system further characterized in that the rendering of visual data on the display comprises the steps of:
Exemplary Embodiment 4.0—According to another embodiment of the present disclosure, an in-situ monitoring and calibration system for a display, the display comprising a plurality of light emitting elements collectively creating a viewing plane with a displayed resolution, the system comprising:
receive visual media data comprising brightness and color information for each of a plurality of picture elements at an encoded resolution;
receive calibration data comprising adjustments to brightness and color for each of a plurality of light emitting elements comprising said display;
transform said visual media data using said calibration data, thereby forming a set of visual data at said displayed resolution, said set of visual data corresponding to said visual media data;
display said set of visual data upon the viewing plane of said display;
display a sequence of one or more calibration images upon said display;
Exemplary Embodiment 5.0—According to another embodiment of the present disclosure, a method for in-situ monitoring and calibration of a display, the display comprising a plurality of light emitting elements collectively creating a viewing plane with a displayed resolution, the method comprising the steps of:
These and other features, aspects, and advantages of the present invention will become better understood with regard to the following description, appended claims, and accompanying drawings where:
2—in-situ display monitoring and calibration system
4—display
6—display control system
8—coordinate system showing x-axis, y-axis, and z-axis
8X—x-axis
8Xa, 8Xb—first x-axis, second x-axis
8Y—y-axis
8Ya, 8Yb—first y-axis, second y-axis
8Z—z-axis
10—square tile, which is a regular 4-sided polygon
10a, 10b, etc.—first square, second square, etc.
11—pitch distance
12—square tiling of the plane
12v—representative vertex of the square tiling
12s—representative side of the square tiling
14—predetermined pattern corresponding to a tiling of the plane
16—rectangular tiling of the plane
20—actuatable linkage
21—communication network
22—data processing means
23—visual media data
24—calibration pattern
25—expected image
26—calibration data set
27—transformed visual media data
28—visual media rendered on the viewing plane of the display
29—configuration data
30—image acquisition system
30a, 30b—first, second image acquisition system
31—image acquisition system stored position
33—image acquisition system deployed position
34—plurality of light receiving elements
35—relative illuminance
36—imaging plane
38—captured image
40—normalized image
41—normalization function
50—incidence vector
50a, 50b, 50c, . . . —first, second, third, etc. incidence vector
70—display module
70a, 70b, 70c, . . . —first, second, third, etc. display module
71—light emitting element
71a, 71b, etc.—first, second, etc. light emitting element
72—plurality of light emitting elements
72a, 72b, etc.—first light emitting element, second light emitting element, etc.
74—display plane
74a, 74b—first display plane, second display plane
75—display plane disposed at a first angle with respect to the viewing plane
76—display module substrate
78—display assembly
78a, 78b, 78c, etc.—first, second, third, etc. display assembly
80—viewing plane
82—surface normal vector
84—luminous output
84a, 84b, . . . —first, second, etc. luminous output
86—viewpoint vector
86a, 86b, . . . —first, second, etc. viewpoint vector
{i, j, k}—unit vectors in x, y, and z directions, respectively
{x_0, y_0, z_0}—location of the center of the imaging plane in 3 dimensions
{x_i, y_i, z_i}—location of the ith light emitting element in 3 dimensions
{a_i, b_i, c_i}—direction cosines corresponding to the ith incidence vector, the incidence vector having the form: a_i*i + b_i*j + c_i*k
G1, G2, G3, G4—first, second, third, fourth geometric feature of the viewing plane
F1, F2, F3, F4—first, second, third, fourth geometric feature identified in a captured image, in which F1 corresponds to G1, F2 corresponds to G2, F3 corresponds to G3, and F4 corresponds to G4.
r0, r1, r2—first, second, third radial distances from a reference point
A0—area located at a distance of r0
A1—projection of area A0 at distance r1
A2—projection of area A0 at distance r2
200—a process for rendering visual media on a viewing plane
202—process step of receiving, in a display control system, a frame of visual media data
204—process step of transforming a frame of visual media data in a display control system
206—process step of displaying a transformed frame of visual media
300—a process for creating a normalized image
302—process step of positioning in a deployed position with respect to a display, an image acquisition system
304—process step of triggering an image acquisition system to acquire a captured image
306—process step of defining an incidence vector for each of a plurality of light receiving elements
308—process step of associating with each of the light receiving elements comprising the imaging plane a normalization function
310—process step of applying a normalization function to a captured image thereby producing a normalized image
400—a process for calibrating a display
402—process step of displaying a calibration pattern
404—process step of associating an expected image with a calibration pattern
406—process step of triggering an image acquisition system to acquire a captured image of a viewing plane
408—process step of creating a normalized image from a captured image
410—process step of forming a calibration data set comprising the color and brightness differences between an expected image and a normalized image
412—process step of applying, in a display control system, a calibration data set to the rendering of visual media upon a viewing plane of a display such that the differences between a normalized image and an expected image are reduced.
Uniformity in color, brightness, and grayscale is a fundamental visual performance goal for a large display. Any visual non-uniformity present on the viewing plane of the display is easily noticed by viewers due to the highly refined and discriminating qualities of the human visual system. It often happens that one or more light emitting elements or display modules must be replaced due to damage, aging, or acts of nature. A replacement light emitting element or display module often has a different grayscale, brightness, and/or color response than the element or module it replaces. In-situ monitoring and calibration of a display is particularly effective for maintaining uniformity in color, brightness, and grayscale across the entire viewing plane of the display, even when replacement of light emitting elements becomes necessary.
In general terms, in-situ display monitoring and calibration uses an image acquisition system to capture images of the viewing plane of the display. Captured images may then be processed to characterize various visual performance characteristics of the display. When not in use for capturing images of the display, the image acquisition system can be stored in a manner that protects it from environmental hazards such as dust, dirt, precipitation, direct sunlight, etc. In addition, images may be presented on the display that facilitate the calibration process. For example, a calibration image in which a plurality of light emitting elements is set to a particular color and intensity may be displayed, an image then captured, and then a difference between what was expected and what was captured may be developed for each light emitting element. Differences between captured images and expected images may be used to create a calibration data set, which may then be used to adjust the display of further images upon the display.
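Expressed as code, one pass of this monitor-and-calibrate cycle might look like the following minimal sketch. The display and imager objects, their methods, and the additive form of the correction are all assumptions made for illustration; the numbered comments refer to the process steps 402-412 enumerated in the reference numeral list above.

```python
import numpy as np

def calibration_cycle(display, imager, pattern: np.ndarray,
                      expected: np.ndarray) -> np.ndarray:
    """One in-situ monitoring and calibration pass (cf. process 400)."""
    display.show(pattern)                  # 402: display a calibration pattern
    raw = imager.capture()                 # 406: capture the viewing plane
    normalized = imager.normalize(raw)     # 408: create a normalized image
    delta = expected - normalized          # 410: per-element color/brightness differences
    calibration = display.calibration + delta   # accumulate the calibration data set
    display.set_calibration(calibration)   # 412: adjust rendering of further images
    return calibration
```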
The visual performance of a display may be referenced from a defined viewpoint, which is essentially a point in 3-dimensional space from which the viewing plane is viewed by a person. The image acquisition system has an imaging plane for capturing images that is generally not at the same location as the viewpoint. A captured image may be post-processed to infer what the display looks like from the viewpoint of choice. Each light emitting element of the display has a predetermined position and orientation in space. Each light emitting element produces an outgoing illuminance that varies in both brightness and color depending on the distance to the viewer and on the angle between the viewer and the illuminance pattern produced by the light emitting element. Knowing the distance, angles, and illuminance pattern between a light emitting element and the imaging plane enables the system to capture images of the viewing plane on the imaging plane and then infer, by computations involving the known distance, angles, and illuminance pattern, what the viewing plane looks like when viewed from the viewpoint. Both monitoring of the display and calibration of the display are thereby enabled by the systems and methods of the present disclosure.
To further facilitate the present description, it will be useful now to turn to the construction of a display according to various embodiments of the present disclosure. Tessellation of a planar surface is the tiling of the plane using one or more geometric shapes, called tiles, creating no gaps and no overlaps. A periodic tiling has a repeated geometric pattern. A regular tiling is a tiling in which all tiles are regular polygons having the same size and shape. Square, triangular, and hexagonal tilings are each an example of a regular, periodic tiling that can achieve a tessellation of a planar surface without gaps or overlaps. Tilings are of special interest in the construction of modular displays because their properties enable the construction of large displays with desirable properties. Assembling a plurality of smaller display modules in which each display module is configured to have a size, shape, and orientation corresponding to a predetermined tiling may produce a large display having no gaps and no overlaps between adjacent display modules.
Within a single display module, a plurality of light emitting elements may be arranged in a predetermined pattern derived from an appropriately configured tiling. A planar tiling of regular polygons consists of edges and vertexes. The set of vertexes of a regular polygon tiling can be seen to create a pattern with a high degree of regularity. A highly uniform visual effect may be produced by placing a light emitting element at or about each of the vertexes of a regular polygon tiling.
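By way of illustration, the predetermined pattern derived from a regular square tiling is simply a uniform grid of vertex locations. The following sketch is illustrative only; the module dimensions and pitch value are arbitrary examples, not parameters of the disclosure.

```python
def square_tiling_vertexes(rows: int, cols: int, pitch: float):
    """Vertex locations of a regular square tiling with uniform pitch,
    one candidate position per light emitting element."""
    return [(col * pitch, row * pitch)
            for row in range(rows) for col in range(cols)]

# Example: a 4 x 4 arrangement of light emitting elements at a 2.5 mm pitch.
positions = square_tiling_vertexes(4, 4, 2.5)
```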
Light emitting elements of the present disclosure may each comprise a single light emitting device or multiple light emitting devices. A preferred light emitting element combines red, blue, and green light emitting devices within one light emitting element so as to provide a full color spectrum display. Monochrome and other combinations of devices may still be used within the spirit and scope of this disclosure. In other embodiments a light emitting element may comprise white, red, blue, and green devices within a single light emitting element. In other embodiments a light emitting element may comprise red, green, blue, and cyan devices. In other embodiments a light emitting element may comprise red, green, blue, yellow, and cyan devices, or any combination of devices emitting at different colors within a single light emitting element. In other embodiments multiple devices emitting at substantially the same color may be used.
In still other embodiments of the present disclosure, light emitting elements may be replaced by light reflective elements. A light reflective element may receive a portion of incoming ambient or directed light and then reflect a portion of the light back to the viewer of a display. Modulating the reflective properties of the light reflective element allows control over the intensity of the reflected light. The portion of incoming ambient or directed light that is not reflected to a viewer may be absorbed, scattered, or otherwise redirected so that it is substantially attenuated with respect to a viewer of the display. A plurality of light reflective elements may be modulated so as to produce images upon a viewing plane. For a light source, a reflective display system may use ambient light, directed non-ambient light, or a combination of both ambient and directed non-ambient light in producing a display.
In creating a uniform visual effect, it is useful to consider a property called pitch distance, which is the distance between any light emitting element and its closest adjacent light emitting elements. It can be seen that a highly uniform visual effect is produced by maintaining a highly uniform pitch throughout a single display module and across a plurality of adjacent display modules. Preferred embodiments of the present disclosure use light emitting elements located at or about the vertexes of a regular polygon tiling. A regular square tiling is one such preferred tiling, producing a uniform visual effect by providing uniform spacing between both rows and columns of light emitting elements. The spacing between adjacent rows and between adjacent columns of a regular square tiling may be referred to as the pitch of that pattern. In such a square tiling, it can be seen that any light emitting element will have at least two closest adjacent neighboring elements, each spaced apart from it by a distance close to or substantially equal to the pitch distance.
In addition to uniform pitch within a single display module, the spacing between display modules can be controlled so that uniform pitch of light emitting elements is maintained across a plurality of assembled display modules. A preferred embodiment is to provide a display module with a perimeter region, of a predetermined width, that contains no light emitting elements. The preferred width of the perimeter region is less than or about equal to one half of the pitch distance, when measured inward and along the edges of the regular polygon tiling defining the location of the plurality of the light emitting elements. When two display modules are assembled adjacent to one another, each module may provide a perimeter region width of about one half of the pitch, which cumulatively creates a pattern of uniform pitch spanning both modules. A plurality of display modules may thereby be assembled to create uniform pitch spanning the plurality of display modules.
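As a worked example, with an illustrative (not prescribed) pitch: if the pitch distance is 2.5 mm, a preferred perimeter region is at most about 2.5 mm/2 = 1.25 mm wide. When two such modules abut, the two perimeter regions combine so that the nearest light emitting elements on opposite sides of the seam are 1.25 mm + 1.25 mm = 2.5 mm apart, equal to the pitch within each module; the seam is therefore visually indistinguishable from the interior of a module.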
A single display module may comprise a plurality of light emitting elements coupled to a substrate and arranged in a predetermined pattern corresponding to the vertexes of a regular polygon tiling. The display module has a perimeter. A plurality of display modules may be assembled such that a portion of the perimeter of each display module abuts a portion of the perimeter of at least one other display module, each module positioned to maintain uniform pitch spacing across the plurality of display modules.
A display system according to the present disclosure may be constructed by assembling a plurality of display modules onto a support frame, the support frame having been previously constructed.
Turning now to
Turning now to
Turning now to
The deployed position 33 shown in
Image acquisition system 30 is triggerable to capture one or more images when the system is in the deployed position. When triggered, an image may be captured, the image comprising at least a portion of the viewing plane. In preferred embodiments the captured image comprises the entire viewing plane. In other preferred embodiments the image acquisition system may comprise a plurality of imaging planes, each having a known position and orientation when in a deployed position, each operative to capture an image of at least a portion of the viewing plane, the plurality of imaging planes operative to capture, collectively, the entire viewing plane.
Turning now to
The stored position of the image acquisition system may be further characterized in that any electrical and optical components of the image acquisition system contributing to or responsible for capturing images are substantially protected from exposure to environmental contaminants including dust, dirt, moisture, direct sunlight, etc., that may detrimentally affect the operation of the image acquisition system.
Continuing with
A viewpoint may be defined anywhere in three-dimensional space from which the viewing plane is visible. The viewpoint represents a viewer located at that point looking at the viewing plane. For any given, fixed viewpoint, at each light emitting element a viewpoint vector may be defined originating at the light emitting element and extending to the viewpoint. For any given, fixed viewpoint, each light emitting element may be expected to possess a unique viewpoint vector. It is evident from the geometry that a fixed viewpoint located far away from the viewing plane has the property that each viewpoint vector is essentially parallel to every other viewpoint vector. In
Each light emitting element produces a luminous flux that radiates away from the light emitting element in 3-dimensional space. To facilitate the discussion, a first surface normal vector may be defined that originates at the location of the light emitting element and extends perpendicular to the local curvature of the viewing plane. In addition, a second surface normal vector may be defined originating at a light receiving element comprising the imaging plane and extending perpendicular to the imaging plane. The portion of a light emitting element's luminous flux that is received remotely from the light emitting element by a light receiving element having a given area is inversely proportional to the squared distance between emitter and receiver and is also a function not only of the brightness of the light emitting element but also of the angle between the first surface normal vector and the second surface normal vector. It is evident that for any predetermined position and orientation of the imaging plane, a unique incidence vector may be defined for each light emitting element comprising the viewing plane, and that both angle and distance affect the light that is received on the imaging plane from any particular light emitting element.
An index i may be created for enumerating each light emitting element comprising the viewing plane. Index i may be allowed to take the values from 1 to N, where N is the total number of light emitting elements comprising the display. An incidence vector may therefore be represented as a_i*i + b_i*j + c_i*k, where {a_i, b_i, c_i} are the direction cosines corresponding to the ith incidence vector and {i, j, k} are unit vectors in the x, y, and z directions, respectively. Furthermore, {x_i, y_i, z_i} describes the location of the ith light emitting element in 3 dimensions, and {x_0, y_0, z_0} describes the location of the center of the imaging plane in 3 dimensions. The distance from any particular light emitting element to the center of the imaging plane can be calculated as:

D_i = [(x_i − x_0)^2 + (y_i − y_0)^2 + (z_i − z_0)^2]^{1/2}

The direction cosines {a_i, b_i, c_i} are accordingly determined by the formulas:

a_i = (x_i − x_0)/D_i;  b_i = (y_i − y_0)/D_i;  c_i = (z_i − z_0)/D_i

An even more exacting relationship can be described in which a unique coordinate {x_0i, y_0i, z_0i} on the imaging plane is associated with each light emitting element that is imaged. In that case the distance may be determined by the formula:

D_i = [(x_i − x_0i)^2 + (y_i − y_0i)^2 + (z_i − z_0i)^2]^{1/2}

and the direction cosines {a_i, b_i, c_i} are then determined by computing:

a_i = (x_i − x_0i)/D_i;  b_i = (y_i − y_0i)/D_i;  c_i = (z_i − z_0i)/D_i
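These relationships translate directly into code. The following is a minimal sketch that computes D_i and the direction cosines for the more exacting, per-element case; the function name and tuple-based interface are illustrative only.

```python
import math

def incidence_geometry(emitter, image_point):
    """Distance D_i and direction cosines {a_i, b_i, c_i} of the incidence
    vector from the ith light emitting element at (x_i, y_i, z_i) to its
    associated imaging-plane coordinate (x_0i, y_0i, z_0i)."""
    xi, yi, zi = emitter
    x0, y0, z0 = image_point
    di = math.sqrt((xi - x0) ** 2 + (yi - y0) ** 2 + (zi - z0) ** 2)
    a, b, c = (xi - x0) / di, (yi - y0) / di, (zi - z0) / di
    # Light received at the imaging plane falls off as 1/D_i**2 and with the
    # angle the incidence vector makes to the surface normals, so D_i and
    # {a, b, c} feed directly into the normalization function (item 41).
    return di, (a, b, c)
```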
Shown now in
It can be understood that the graph in
Turning now to
While the embodiments of
The apparatus of
Turning now to
The display control system 6 of
Embodiments like that disclosed in
Although the present invention has been described in considerable detail with reference to certain preferred versions thereof, other versions are possible. It may be desirable to combine features shown in various embodiments into a single embodiment. A different number and configuration of features may be used to construct embodiments of the apparatus and systems that are entirely within the spirit and scope of the present disclosure. Therefore, the spirit and scope of the appended claims should not be limited to the description of the preferred versions contained herein.
Any element in a claim that does not explicitly state “means for” performing a specified function, or “step for” performing a specific function, is not to be interpreted as a “means” or “step” clause as specified in 35 U.S.C. Section 112, Paragraph 6. In particular, the use of “step of” in the claims herein is not intended to invoke the provisions of 35 U.S.C. Section 112, Paragraph 6.
Inventors: Richard C. Cope; Theodore Heske, III