An image obtaining section of an information processing apparatus obtains polarized light images in a plurality of directions from an imaging device provided to a moving body. A polarization degree obtaining section of an image analyzing section obtains a degree of polarization on the basis of direction dependence of luminance of polarized light. A high polarization degree region evaluating section detects the peripheral edge of a play field on which the moving body is present, by evaluating the shape of a high polarization degree region having degrees of polarization higher than a threshold value in a state in which the high polarization degree region is overlooked. An output data generating section generates and outputs output data for avoiding movement to the peripheral edge.
7. A play field deviation detecting method performed by an information processing apparatus, the method comprising:
obtaining data of polarized light images in a plurality of directions, the polarized light images being photographed by an imaging device provided to a moving body;
obtaining distribution of degrees of polarization by using the polarized light images;
detecting a peripheral edge of a play field on which the moving body is present, by evaluating a shape of a high polarization degree region having degrees of polarization higher than a threshold value in a state in which the high polarization degree region is overlooked from directly above the moving body, where the detecting includes detecting the peripheral edge of the play field by identifying a missing part in an arc shape representing the high polarization degree region and corresponding to an angle of view; and
generating and outputting output data for avoiding movement to the peripheral edge.
8. A non-transitory, computer readable storage medium containing a computer program, which when executed by a computer, causes the computer to perform a play field deviation detecting method by carrying out actions, comprising:
obtaining data of polarized light images in a plurality of directions, the polarized light images being photographed by an imaging device provided to a moving body;
obtaining distribution of degrees of polarization by using the polarized light images;
detecting a peripheral edge of a play field on which the moving body is present, by evaluating a shape of a high polarization degree region having higher degrees of polarization than a threshold value in a state in which the high polarization degree region is overlooked from directly above the moving body, where the detecting includes detecting the peripheral edge of the play field by identifying a missing part in an arc shape representing the high polarization degree region and corresponding to an angle of view; and
generating and outputting output data for avoiding movement to the peripheral edge.
1. An information processing apparatus comprising:
an image obtaining section configured to obtain data of polarized light images in a plurality of directions, the polarized light images being photographed by an imaging device provided to a moving body;
a polarization degree obtaining section configured to obtain distribution of degrees of polarization by using the polarized light images;
a high polarization degree region evaluating section configured to detect a peripheral edge of a play field on which the moving body is present, by evaluating a shape of a high polarization degree region having degrees of polarization higher than a threshold value in a state in which the high polarization degree region is overlooked from directly above the moving body, where the high polarization degree region evaluating section detects the peripheral edge of the play field by identifying a missing part in an arc shape representing the high polarization degree region and corresponding to an angle of view; and
an output data generating section configured to generate and output output data for avoiding movement to the peripheral edge.
2. The information processing apparatus according to
3. The information processing apparatus according to
4. The information processing apparatus according to
5. The information processing apparatus according to
6. The information processing apparatus according to
The present invention relates to an information processing apparatus and a play field deviation detecting method that detect, before it occurs, deviation of a moving body from a play field.
Technologies are known in which a user wears a head-mounted display on the head and plays a game or views electronic content while watching a displayed screen (see PTL 1, for example). Among such technologies, those that realize augmented reality or virtual reality are also spreading: an imaging device is provided on the front of the head-mounted display, and a part of an image photographed in a field of view corresponding to the direction of the user's face is replaced with a virtual object, or a virtual world is represented in the corresponding field of view. In addition, a technology is also known that makes a robot move safely or perform a target operation flexibly by providing the robot with an imaging device, making the robot recognize objects present in its surroundings, and thereby creating an environment map.
While wearing the head-mounted display, the user cannot see the outside world. The user may therefore lose a sense of direction or, by becoming too immersed in the game, move to an unexpected position in the real space. Such conditions are difficult for the user to recognize without taking off the head-mounted display. Likewise, when a robot is provided with an imaging device, the robot may move in an unexpected direction because of erroneous recognition of surrounding objects and conditions. In either case, there is a danger that the user or the robot collides with a wall or an obstacle or stumbles at a level difference in the ground.
The present invention has been made in view of such problems. It is an object of the present invention to provide a technology that can suitably control the moving range of a moving body such as a robot or a user wearing a head-mounted display.
An aspect of the present invention relates to an information processing apparatus. This information processing apparatus includes an image obtaining section configured to obtain data of polarized light images in a plurality of directions, the polarized light images being photographed by an imaging device provided to a moving body; a polarization degree obtaining section configured to obtain distribution of degrees of polarization by using the polarized light images; a high polarization degree region evaluating section configured to detect a peripheral edge of a play field on which the moving body is present, by evaluating a shape of a high polarization degree region having degrees of polarization higher than a threshold value in a state in which the high polarization degree region is overlooked from directly above the moving body; and an output data generating section configured to generate and output output data for avoiding movement to the peripheral edge.
Yet another aspect of the present invention relates to a play field deviation detecting method. This play field deviation detecting method performed by an information processing apparatus includes a step of obtaining data of polarized light images in a plurality of directions, the polarized light images being photographed by an imaging device provided to a moving body; a step of obtaining distribution of degrees of polarization by using the polarized light images; a step of detecting a peripheral edge of a play field on which the moving body is present, by evaluating a shape of a high polarization degree region having degrees of polarization higher than a threshold value in a state in which the high polarization degree region is overlooked from directly above the moving body; and a step of generating and outputting output data for avoiding movement to the peripheral edge.
It is to be noted that any combinations of the above constituent elements and modes obtained by converting expressions of the present invention between a method, a device, and the like are also effective as modes of the present invention.
According to the present invention, it is possible to suitably control the moving range of a moving body such as a robot or a user wearing a head-mounted display.
On the other hand, there may also be a mode in which, instead of the robot 4, the user wears a head-mounted display including the imaging device 12. In this case, the information processing apparatus 10 may be included in the head-mounted display, or may be an external device capable of communicating with the head-mounted display by radio or wire. In this case, the information processing apparatus 10 may generate a display image corresponding to a line of sight of the user, on the basis of the photographed image, and display the display image on the head-mounted display. At least a part of the display image may be an image photographed by the imaging device 12.
Hence, information to be output from the information processing apparatus 10 may be a control signal to the robot 4, or may be an image to be displayed on the head-mounted display, an audio signal to be output, or the like. The information to be output from the information processing apparatus 10 can change depending on the mode to be implemented. The present embodiment aims at safely moving a moving body such as the robot 4 or the user within a predetermined play field 14. Hereinafter, the moving body will be described mainly as the robot 4. However, the robot 4 can be replaced with the user wearing the head-mounted display.
The play field 14 represents the range in which the robot 4 can move safely. Basically, the play field 14 is set as a planar region free of slopes or level differences at which the robot 4 may lose its balance and fall, and of height differences from which the robot 4 may drop. A table, a floor, the ground, and the like may serve as the play field 14, and the peripheral edge of the play field 14 is defined by a boundary at which the planar shape ends, such as an edge of the table, a wall or a staircase continuous with the floor, or a pit or an inclination in the ground. Hence, the shape and size of the play field 14 are not limited.
The information processing apparatus 10 uses the polarized light images photographed by the imaging device 12 to detect that the robot 4 is about to move off the play field 14 and, when this is the case, controls the robot 4 so as to avoid the deviation. For this purpose, the information processing apparatus 10 obtains the distribution of degrees of polarization on the moving surface by using polarized light images in a plurality of directions photographed by the imaging device 12. Whether or not the peripheral edge of the play field 14 is present in the vicinity is then determined on the basis of the shape of a region in which degrees of polarization higher than a predetermined threshold value are obtained (for example, a region 16).
The linear polarizing plate 70 transmits only linearly polarized light vibrating in a certain direction in the light ray 82. The vibration direction of the transmitted polarized light will hereinafter be referred to as a transmission axis of the linear polarizing plate 70. The transmission axis can be set in any direction when the linear polarizing plate 70 is rotated about an axis perpendicular to a surface thereof. If light arriving at the imaging device 12 is unpolarized light, observed luminance is constant even when the linear polarizing plate 70 is rotated. On the other hand, because ordinary reflected light is partially polarized light, the observed luminance changes with respect to the direction of the transmission axis. In addition, a manner in which the luminance changes differs depending on a ratio between specular reflection and diffuse reflection and an angle of incidence.
That is, specularly reflected light includes a high proportion of s-polarized light vibrating in a direction perpendicular to the incidence plane 76, and diffusely reflected light includes a high proportion of p-polarized light vibrating in a direction parallel with the incidence plane 76. However, each proportion depends on the angle of incident light (or emitted light) at the observation point “a.” In any event, when the specularly reflected light is dominant, the observed luminance is at a maximum in a state in which the transmission axis of the linear polarizing plate 70 is perpendicular to the incidence plane 76, and the observed luminance is at a minimum in a state in which the transmission axis is parallel with the incidence plane.
When the diffusely reflected light is dominant, the observed luminance is at a maximum in a state in which the transmission axis of the linear polarizing plate 70 is parallel with the incidence plane, and the observed luminance is at a minimum in a state in which the transmission axis is perpendicular to the incidence plane. Hence, the changes in polarized light luminance at the image forming point "b," obtained by photographing polarized light images with the transmission axis in various directions, include information regarding the angle of the incidence plane 76 and the angle of the incident light (or the emitted light), and in turn information regarding the normal vector n. Here, the angle θ formed between the normal vector n and the light ray 82 is referred to as the zenith angle of the normal vector n.
In view of the fact that the s-polarized light vibrates perpendicular to the incidence plane and the p-polarized light vibrates parallel with the incidence plane, the polarization direction (ψs − 90°) at which the luminance is at a minimum under specular reflection, or the polarization direction ψd at which the luminance is at a maximum under diffuse reflection, represents the angle of the incidence plane. The normal vector n is always included in the incidence plane, so this angle represents the angle of the vector obtained by projecting the normal vector n onto the photographed image plane; it is generally referred to as the azimuth angle of the normal vector n. The normal vector in a three-dimensional space as viewed from the imaging device 12 is uniquely determined by obtaining the zenith angle described above in addition to the azimuth angle. The polarization direction at which the luminance of the observed polarized light is at a maximum is referred to as the phase angle ψ. The changes in the luminance I with respect to the transmission-axis direction take the form described next.
By approximating the luminance observed in a plurality of polarization directions φ while the linear polarizing plate 70 is rotated, in the form of Equation 1 with use of a least-squares method or the like, Imax, Imin, and ψ can be obtained. The degree of polarization ρ is then obtained from Imax and Imin by Equation 2.
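The equations themselves do not appear in this text. As a reference sketch, and as an editorial assumption rather than a reproduction of the patent's own notation, the standard forms consistent with the above description, to which Equation 1 and Equation 2 presumably correspond, are:

```latex
% Assumed standard forms; the patent's own Equation 1 and Equation 2 are not reproduced in this text.
I(\phi) = \frac{I_{\max} + I_{\min}}{2} + \frac{I_{\max} - I_{\min}}{2}\cos\bigl(2(\phi - \psi)\bigr)
\qquad
\rho = \frac{I_{\max} - I_{\min}}{I_{\max} + I_{\min}}
```

where φ is the direction of the transmission axis and ψ is the phase angle defined above.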
In Equation 3, η is the index of refraction of the object. The zenith angle θ is obtained by assigning the degree of polarization ρ obtained by Equation 2 to one of ρs and ρd in Equation 3, the expressions for the specular and diffuse cases, respectively. A normal vector (px, py, pz) is then obtained on the basis of the azimuth angle α and the zenith angle θ thus obtained.
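As an illustrative sketch of this per-pixel computation, the following assumes polarized luminance images captured with transmission axes at 0°, 45°, 90°, and 135° (the four-direction arrangement described later); for four equally spaced directions, the least-squares fit of the cosine model has the closed form used below. The normal-vector expression in the comment is the standard spherical parameterization and is an assumption, since the patent's own expression is not shown here.

```python
import numpy as np

def degree_of_polarization(i0, i45, i90, i135):
    """Per-pixel degree of polarization rho and phase angle psi from four
    polarized luminance images (transmission axes at 0, 45, 90, 135 degrees)."""
    i0, i45, i90, i135 = (np.asarray(x, dtype=np.float64) for x in (i0, i45, i90, i135))
    mean = 0.25 * (i0 + i45 + i90 + i135)      # (Imax + Imin) / 2
    c = 0.5 * (i0 - i90)                       # amplitude * cos(2*psi)
    s = 0.5 * (i45 - i135)                     # amplitude * sin(2*psi)
    amp = np.hypot(c, s)                       # (Imax - Imin) / 2
    rho = amp / np.maximum(mean, 1e-9)         # degree of polarization
    psi = 0.5 * np.arctan2(s, c)               # phase angle in radians
    # Given an azimuth angle alpha and a zenith angle theta (the latter derived from rho
    # via the specular/diffuse model), a normal vector could be formed as
    # (cos(alpha) * sin(theta), sin(alpha) * sin(theta), cos(theta))  [assumed convention].
    return rho, psi
```

The zenith-angle lookup itself (Equation 3) is deliberately omitted here; only the quantities the play field detection relies on, the degree of polarization ρ and the phase angle ψ, are computed.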
Incidentally, in the present embodiment, means for observing the polarized light luminance is not limited to the linear polarizing plate. For example, a layer of polarizers may be provided as a part of an imaging element structure.
The wire grid polarizer layer 114 includes polarizers in which a plurality of linear conductor members are arranged in a stripe manner at intervals smaller than the wavelength of the incident light. When light condensed by the microlens layer 112 enters the wire grid polarizer layer 114, a polarized light component in a direction parallel with polarizer lines is reflected, and only a polarized light component perpendicular to the polarizer lines is transmitted. A polarized light image is obtained by detecting the transmitted polarized light component by the light detecting layer 118. The light detecting layer 118 has a semiconductor element structure of an ordinary CCD (Charge Coupled Device) image sensor, an ordinary CMOS (Complementary Metal Oxide Semiconductor) image sensor, or the like.
The wire grid polarizer layer 114 includes an arrangement of polarizers such that transmitting polarization directions are different in charge reading units in the light detecting layer 118, i.e., pixel units, or larger units. A right side of the figure illustrates a polarizer arrangement 120 when the wire grid polarizer layer 114 is viewed from an upper surface. Shaded lines in the figure are conductors (wires) constituting the polarizers. Incidentally, rectangles of dotted lines each represent a region of a polarizer in one direction, and the dotted lines themselves are not actually formed.
In the illustrated example, polarizers in four directions are arranged in four regions 122a, 122b, 122c, and 122d in two rows and two columns. In the figure, polarizers located on a diagonal line have transmitting directions orthogonal to each other, and polarizers adjacent to each other have a difference of 45°. That is, polarizers in four directions at intervals of 45° are provided. This substitutes for the linear polarizing plate 70. In the light detecting layer 118 disposed below, polarized light information in four directions at intervals of 45° can be obtained in respective regions corresponding to the four regions 122a, 122b, 122c, and 122d. An image sensor that simultaneously obtains the polarized light information in the four directions as two-dimensional data can be implemented by further arranging a predetermined number of such polarizer arrangements in a vertical direction and a horizontal direction and connecting a peripheral circuit that controls charge readout timing.
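As a rough illustration of how such a mosaic could be read out in software, the following sketch assumes a raw frame over which the 2-row, 2-column polarizer unit repeats, with an assumed cell-to-direction mapping; the actual mapping of the regions 122a to 122d is device dependent and is not taken from the description.

```python
import numpy as np

def split_polarization_mosaic(raw):
    """Split a raw frame with a repeating 2x2 polarizer pattern into four
    quarter-resolution images, one per transmission-axis direction."""
    raw = np.asarray(raw)
    # Assumed cell layout per 2x2 unit:  [[0 deg, 45 deg], [135 deg, 90 deg]]
    return {
        0:   raw[0::2, 0::2],
        45:  raw[0::2, 1::2],
        135: raw[1::2, 0::2],
        90:  raw[1::2, 1::2],
    }
```

In practice, each quarter-resolution channel would then be interpolated as appropriate to obtain full-resolution polarized light images.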
The imaging element 110 illustrated in the figure is provided with the color filter layer 116 between the wire grid polarizer layer 114 and the light detecting layer 118. The color filter layer 116, for example, includes an arrangement of filters that respectively transmit light of red, green, and blue in correspondence with respective pixels. Polarized light information is thereby obtained on a color-by-color basis according to a combination of the directions of the polarizers in the wire grid polarizer layer 114 located on an upper side and the colors of the filters in the color filter layer 116 located on a lower side. That is, because polarized light information in the same direction and the same color is discretely obtained on the image plane, polarized light images in respective directions and respective colors are obtained by interpolating the polarized light information as appropriate.
In addition, an unpolarized color image can also be reproduced by performing operations between polarized light images of the same color. An image obtaining technology using wire grid polarizers is also disclosed in, for example, Japanese Patent Laid-Open No. 2012-80065. However, the present embodiment basically uses polarized light luminance images, and thus the color filter layer 116 can be omitted when color images are not needed for other purposes. In addition, the polarizers are not limited to the wire grid type, and linear dichroic polarizers or the like can also be used.
The zenith angle θ of the normal vector n at each position is an angle formed between a light ray reaching the imaging device 12 mounted on the robot 4 and the normal vector n. Thus, between a distance d from the robot 4 and the zenith angle θ, a relation of d=H tan θ holds, where H is the height of the imaging device 12. That is, the larger the distance d, the larger the zenith angle θ. In the illustrated example, a zenith angle θ1 at a most distant position, a zenith angle θ2 at an intermediate position, and a zenith angle θ3 at a most adjacent position have a relation of θ1>θ2>θ3.
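As an illustrative example with hypothetical numbers, if the imaging device 12 were mounted at a height of H = 0.3 m, zenith angles of 0.6 rad and 1.3 rad would correspond to distances of roughly d = 0.3 tan(0.6) ≈ 0.21 m and d = 0.3 tan(1.3) ≈ 1.08 m, respectively.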
On the other hand, in a case where the play field 14 has a material that makes specular reflection dominant, zenith angles at which a high degree of polarization is obtained are in a range of approximately 0.6 to 1.3 rad, as illustrated in the upper part of the corresponding figure. On the play field, the region in which a high degree of polarization is obtained therefore forms a band of distances around the robot 4 determined by this angular range.
On the other hand, consider a case where there is a level difference in the vicinity of the robot 4, as illustrated in the corresponding figure.
In the illustrated example, the zenith angle θ4 of the normal vector n′ is close to 90° in the region of the slope 152, and thus that region is excluded from the high polarization degree region. In the region farther out, where there is no slope, the normal vector n is in the same direction as on the play field 14; however, because of the large distance from the robot 4, the zenith angle θ5 is large, and this region is not included in the high polarization degree region either. As a result, the high polarization degree region 160 in this case has a shape in which a part of the circumference is missing. While the figure illustrates a case where the slope 152 is present outside the peripheral edge, the same applies where the surface is discontinuous, as at an edge of a table, and where there is a wall.
The information processing apparatus 10 detects the position B of the peripheral edge of the play field 14 on the basis of the missing part and controls the robot 4 so that the robot 4 does not go beyond the peripheral edge. Incidentally, in reality, only a further limited part of the high polarization degree region 160 may be obtained, owing to the field of view of the imaging device 12. However, when the imaging device 12 is provided so as to photograph in the traveling direction of the robot 4, it is possible to detect the missing part in the traveling direction, and in turn the peripheral edge of the play field 14. In addition, in the photographed image the high polarization degree region 160 appears in a perspectively projected shape; however, its shape in the overlooked state is easily obtained by coordinate transformation of the kind used in computer graphics.
That is, a bird's-eye view as illustrated in the figure is obtained when an image on the image plane is once back projected onto the play field 14 in a three-dimensional space on the basis of a positional relation between the imaging device 12 and the surface of the play field 14 and the play field 14 is projected into an overlooking camera coordinate system. The degree of polarization is a parameter independent of color information and is thus not affected even when the play field 14 is an object of a material having a high light ray transmittance such as glass or acrylic resin.
When the robot 4 is made to walk on a glass table, for example, no distinction may be made between the floor and the table on the basis of a color image alone, and the robot 4 may, for example, fall to the floor as a result of failing to recognize an edge of the table. Even when the table and the floor have the same color, a similar case may occur depending on the state of ambient light. The present embodiment is based on the degree of polarization and introduces a simple criterion, that is, the shape of the high polarization degree region. Thus, the present embodiment can detect the peripheral edge of the play field 14 with ease and high robustness as compared with ordinary object detection using a color image, and can thus ensure the safety of the robot 4 or the user.
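As for the back projection into the overlooked view described above, a minimal sketch is given below, assuming a pinhole camera at a known height with a known downward pitch above a horizontal play field; the intrinsic parameters and the pose are hypothetical inputs rather than values from the description.

```python
import numpy as np

def image_to_ground(points_px, fx, fy, cx, cy, cam_height, pitch_rad):
    """Back-project pixel coordinates onto the ground plane (z = 0) for a camera
    at height cam_height, pitched down by pitch_rad about its x axis.
    Returns ground coordinates (x lateral, y forward) per point; points whose
    rays do not hit the ground are returned as NaN."""
    pts = np.asarray(points_px, dtype=np.float64)
    # Ray directions in the camera frame (x right, y down, z forward).
    rays = np.stack([(pts[:, 0] - cx) / fx,
                     (pts[:, 1] - cy) / fy,
                     np.ones(len(pts))], axis=1)
    # Rotate rays into a world frame whose z axis points up.
    c, s = np.cos(pitch_rad), np.sin(pitch_rad)
    rot = np.array([[1, 0, 0],           # world x = camera x (lateral)
                    [0, -s, c],          # world y (forward)
                    [0, -c, -s]])        # world z (up)
    rays_w = rays @ rot.T
    # Intersect each ray, starting at (0, 0, cam_height), with the plane z = 0.
    t = -cam_height / rays_w[:, 2]
    ground = rays_w[:, :2] * t[:, None]
    ground[t <= 0] = np.nan              # ray points away from the ground
    return ground
```

Applying such a transformation to the pixels of the high polarization degree region yields its shape in the overlooked state.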
The CPU 23 controls the whole of the information processing apparatus 10 by executing an operating system stored in the storage unit 34. The CPU 23 also executes various kinds of programs read from the removable recording medium and loaded into the main memory 26 or downloaded via the communicating unit 32. The GPU 24 has functions of a geometry engine and functions of a rendering processor. The GPU 24 performs rendering processing according to a rendering instruction from the CPU 23 and stores the data of a display image in a frame buffer not illustrated.
Then, the display image stored in the frame buffer is converted into a video signal, and the video signal is output to the output unit 36. The main memory 26 is formed by a RAM (Random Access Memory). The main memory 26 stores a program and data necessary for processing. Incidentally, as described above, various applications are possible for the present embodiment. Thus, a part of the configuration illustrated in the figure may be omitted or replaced with another circuit depending on an output form of a processing result according to an application.
The information processing apparatus 10 includes an image obtaining section 50 that obtains data of a photographed image from the imaging device 12; an image data storage section 52 that stores the obtained data of the image; an image analyzing section 54 that performs image analysis including detection of the peripheral edge of the play field; and an output data generating section 56 that generates data to be output, by using a result of the analysis.
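The functional configuration can be pictured with the following minimal skeleton; the section names follow the description, while the method names and signatures are editorial placeholders.

```python
class InformationProcessingApparatus:
    """Sketch of the functional sections described above (method names are assumed)."""

    def __init__(self, image_obtaining, image_data_storage,
                 image_analyzing, output_data_generating):
        self.image_obtaining = image_obtaining                 # obtains photographed image data
        self.image_data_storage = image_data_storage           # stores the obtained data
        self.image_analyzing = image_analyzing                 # polarization analysis, edge detection
        self.output_data_generating = output_data_generating   # builds control/display output

    def step(self):
        frames = self.image_obtaining.obtain()                 # polarized light images
        self.image_data_storage.store(frames)
        analysis = self.image_analyzing.analyze(frames)        # includes play-field edge detection
        return self.output_data_generating.generate(analysis)
```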
The image obtaining section 50 is implemented by the input unit 38, the CPU 23, and the like in the hardware configuration described above, and obtains the data of the polarized light images in a plurality of directions photographed by the imaging device 12.
The image obtaining section 50 may further obtain the data of an ordinary photographed color image, depending on the objective of information processing or the content of image analysis as in a case where the photographed image is used in the output display image or the like. In addition, in a case where the imaging device 12 is a stereo camera including two cameras arranged at known intervals, the image obtaining section 50 may obtain the data of a stereo image having a left-right parallax, the stereo image being photographed by the cameras. The image obtaining section 50 sequentially stores the data of the obtained photographed image in the image data storage section 52.
The image analyzing section 54 is implemented by the CPU 23, the GPU 24, and the like in the hardware configuration described above, and includes a polarization degree obtaining section 58, which obtains the distribution of degrees of polarization from the polarized light images, and a high polarization degree region evaluating section 60, which detects the peripheral edge of the play field from that distribution.
When there are polarized light images in three directions φ1, φ2, and φ3, the three unknowns Imax, Imin, and ψ of Equation 1 can be determined, so that a continuous function of the luminance with respect to the polarization direction is obtained and the degree of polarization can be calculated for each pixel.
Normally, the play field 14 is on the lower side of the field of view. Thus, a lower half region of the photographed image, for example, may be set as the target of the polarization degree operation. In a case where the posture of the imaging device 12 can be obtained separately by an acceleration sensor or the like, the operation target region may be adjusted on the basis of information regarding the posture. The high polarization degree region evaluating section 60 extracts a region on the image plane in which degrees of polarization higher than a threshold value set in advance are obtained, and evaluates the shape of the high polarization degree region as it appears when overlooked. At this time, the ordinary viewpoint transformation processing used in computer graphics, described above, is performed.
Due to the field of view of the imaging device 12, the high polarization degree region is basically a part of a circumference. The portion cut off by the field of view is apparent from positions on the image plane. Hence, the high polarization degree region evaluating section 60 determines whether or not there is another missing part, that is, a missing part in the arc shape corresponding to the angle of view. The presence or absence of such a missing part can be determined by, for example, preparing a template image of the arc shape in advance and obtaining a difference by pattern matching. Then, when there is a missing part, information indicating that the peripheral edge of the play field 14 is near is supplied to the output data generating section 56 together with information regarding its direction, distance, shape, and the like.
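The following sketch illustrates one way such a check could be performed on the polarization-degree map after it has been transformed to the overlooked view. Instead of the template matching mentioned above, it uses a simple angular-coverage test around the moving body; the threshold, radial band, and bin count are hypothetical parameters.

```python
import numpy as np

def find_missing_arc(rho, xs, ys, rho_thresh=0.5,
                     r_min=0.2, r_max=1.2, n_bins=72, min_frac=0.2):
    """Return headings (radians) of angular bins, within the visible range, where the
    high polarization degree ring is missing.
    rho: degrees of polarization; xs, ys: ground coordinates relative to the moving
    body; all arrays of the same shape (e.g., per back-projected pixel).
    A bin counts as missing when fewer than min_frac of its cells in the expected
    radial band exceed rho_thresh."""
    r = np.hypot(xs, ys)
    ang = np.arctan2(ys, xs)
    in_band = (r >= r_min) & (r <= r_max)
    bins = ((ang + np.pi) / (2 * np.pi) * n_bins).astype(int) % n_bins
    missing = []
    for b in range(n_bins):
        cells = in_band & (bins == b)
        if not cells.any():
            continue                      # bin outside the field of view: not evaluated
        frac_high = np.mean(rho[cells] > rho_thresh)
        if frac_high < min_frac:
            missing.append(-np.pi + (b + 0.5) * 2 * np.pi / n_bins)
    return missing
```

Each heading returned by the check corresponds to a direction in which the arc is missing, that is, a direction in which the peripheral edge of the play field 14 is considered to be near.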
Incidentally, when the robot 4 moves or changes the direction, the imaging device 12 can photograph images in various fields of view. Using this, the high polarization degree region evaluating section 60 may create an environment map on which the range of the play field 14 is indicated in a two-dimensional or three-dimensional unified coordinate system, by accumulating and recording the position of the imaging device 12 when each photographed image is obtained and the shape of the high polarization degree region obtained at the position.
Once the range of the play field 14 has been defined by obtaining, or interpolating, the peripheral edge in all directions, the positional relation between the robot 4 and the peripheral edge can thereafter be grasped without performing the evaluation based on the degrees of polarization. In addition to the above-described processing, the image analyzing section 54 may perform ordinary image analysis processing such as obtaining the normal vector of an object present in the vicinity of the robot 4. The position and posture of the object may be determined accurately by, for example, identifying the position of the object with use of a stereo image and integrating that position with information regarding the normal vector.
The output data generating section 56 is implemented by the CPU 23, the GPU 24, the output unit 36, and the like in the hardware configuration described above, and generates the data to be output on the basis of the result of the analysis by the image analyzing section 54.
When the peripheral edge of the play field 14 is not being approached, the output data generating section 56 performs the original information processing and outputs a control signal or the data of a display image. The content of the information processing performed at this time and the kind of the output data are not particularly limited. For example, a control signal that makes the robot lift a nearby object or walk while weaving between nearby objects is generated and transmitted to the robot. Alternatively, an image representing augmented reality or virtual reality in a field of view corresponding to the line of sight of the user wearing the head-mounted display may be generated and transmitted to the head-mounted display. In this processing, the data of the photographed image stored in the image data storage section 52 and results of object detection by the image analyzing section 54 may be used as appropriate.
In actuality, however, the high polarization degree region may be obtained at a higher frequency. In addition, as described above, only the part of the high polarization degree region that is included in the field of view of the imaging device 12 is obtained by a single photographing operation. For example, when the robot 4 is moving toward the peripheral edge of the play field 14 at a timing at which the robot 4 is present at (x2, y2), photographing in that direction produces a missing part in the high polarization degree region. The robot 4 can be prevented from deviating from the play field 14 by making the robot 4 change direction accordingly.
On the other hand, in a case where the position coordinates (x1, y1), (x2, y2), . . . (x9, y9) and the orientation of the imaging device 12 can be obtained by a sensor included in the robot 4, an imaging device externally photographing the robot 4, or the like, the range of the play field 14 can be obtained by accumulating and storing the shape of the high polarization degree region obtained at each position. That is, as illustrated in the figure, when the high polarization degree regions are represented so as to be superimposed on position coordinates in the unified coordinate system, a union of the high polarization degree regions represents the range of the play field 14.
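A minimal sketch of this accumulation is given below, assuming that each observation has already been converted into a set of high-polarization ground points in the device frame together with the position and orientation of the imaging device 12 in the unified coordinate system; the grid resolution and extent are hypothetical.

```python
import numpy as np

def accumulate_play_field(observations, grid_size=200, cell_m=0.02):
    """Union of high polarization degree regions in a unified coordinate grid.
    observations: iterable of (points_xy, pose_xy, yaw), where points_xy are the
    high-polarization ground points in the device frame (meters)."""
    grid = np.zeros((grid_size, grid_size), dtype=bool)
    origin = grid_size // 2
    for points_xy, pose_xy, yaw in observations:
        c, s = np.cos(yaw), np.sin(yaw)
        rot = np.array([[c, -s], [s, c]])
        world = np.asarray(points_xy) @ rot.T + np.asarray(pose_xy)  # device -> unified frame
        idx = np.round(world / cell_m).astype(int) + origin
        ok = (idx >= 0).all(axis=1) & (idx < grid_size).all(axis=1)
        grid[idx[ok, 1], idx[ok, 0]] = True                          # mark cells seen as play field
    return grid
```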
Operation that can be implemented by the configuration described above will next be described.
Then, the high polarization degree region evaluating section 60 extracts a region in which degrees of polarization higher than the threshold value are obtained, and obtains the shape of the high polarization degree region when the high polarization degree region is overlooked, by performing viewpoint transformation. Then, when there is a missing part in the arc shape in a visual field range (Y in S14), data for avoiding deviation from the play field 14 is output via the output data generating section 56 (S16). For example, a control signal to change direction is transmitted to the robot 4. Alternatively, a warning image for giving an instruction to change direction is superimposed on the image of the content or the like and transmitted to the display device of the head-mounted display or the like.
Sound indicating a warning may be transmitted to the head-mounted display, a speaker not illustrated, or the like. Alternatively, the warning may be indicated by vibration, by transmitting a control signal that vibrates a vibrator included in a controller held by the user or the head-mounted display worn by the user. The warning of the deviation may be given by outputting one of an image, sound, and vibration or a combination of two or more thereof from an output device in a form recognizable by the user as described above.
When there is no missing part in the high polarization degree region (N in S14), data resulting from normal processing, such as a control signal according to the original purpose or a display image or sound of electronic content, is generated and output (S18). When there is no need to stop the processing due to a request from the user or the like (N in S20), the processing from S10 to S18 is repeated. The entire processing is ended when a need to stop it arises (Y in S20).
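Put together, the flow from S10 to S20 could be sketched as the following loop; the step numbering for S10 and S12 and all method names are placeholders rather than the patent's actual interfaces.

```python
def run(apparatus, stop_requested):
    """Sketch of the S10-S20 loop (numbering for S10/S12 is assumed)."""
    while not stop_requested():                                          # S20: stop request check
        frames = apparatus.image_obtaining.obtain()                      # S10: obtain polarized images
        rho_map = apparatus.image_analyzing.polarization_degree(frames)  # S12 (assumed): degree map
        missing = apparatus.image_analyzing.missing_arc(rho_map)         # S14: arc check
        if missing:
            apparatus.output_data_generating.output_avoidance(missing)   # S16: warn / steer away
        else:
            apparatus.output_data_generating.output_normal()             # S18: normal output
```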
According to the present embodiment described above, the imaging device provided to the robot capable of free movement, or to the head-mounted display worn by the user, photographs the polarized light images. When the region in which the degree of polarization is higher than the threshold value, as viewed in an overlooked state from the photographing viewpoint, has a missing part in its shape, it is determined that the normal vectors are not uniform, that is, that a non-flat part is present within the angle of view. A control signal to change direction is then transmitted to the robot, or a warning image giving an instruction to change direction is displayed on the head-mounted display.
It is thus possible to prevent the moving body from dropping off the play field or toppling over as a result of continuing to move while failing to recognize a level difference or an edge of a surface. Because the present embodiment uses the degree of polarization, which is independent of color information, it accurately detects the peripheral edge of a transparent surface such as glass or of a surface whose color is difficult to distinguish from those of surrounding objects. In addition, because the determination can be made by the simple criterion of whether or not there is a missing part in the arc shape, it is less affected by errors resulting from the computation or the like. As a result, safety can easily be improved for free movement of a moving body such as the robot or the user.
The present invention has been described above on the basis of the embodiment thereof. The foregoing embodiment is illustrative, and it is to be understood by those skilled in the art that combinations of constituent elements and processing processes of the embodiment are susceptible of various modifications and that such modifications also fall within the scope of the present invention.
For example, the present embodiment detects the peripheral edge of the play field by using the effect that a change in shape from a flat surface has on the shape of the high polarization degree region. In this case, as long as a flat surface of the same material, whose polarization state does not change, continues, the flat surface is regarded as the play field. On the other hand, by using the fact that a change in material changes the polarization state and thus affects the shape of the high polarization degree region, the range of the play field can be limited even on a single flat surface. For example, as described above, a material that makes specular reflection dominant and a material that makes diffuse reflection dominant yield clearly different degrees of polarization under the same conditions of observation.
Hence, when, for example, the play field is set to be a mat or a flat board of a material that makes the specular reflection component dominant and is laid on a surface of a material that makes diffuse reflection dominant, the high polarization degree region is missing outside the play field even though the surface remains flat. It is thus possible to set the movable range of the moving body in various manners. Various effects other than safety can also be obtained, for example by imparting game characteristics to the shape of the play field itself, or by preventing the robot or the user from going outside the photographing angle of view in a mode in which the robot or the user is externally photographed and tracked.
10 Information processing apparatus, 12 Imaging device, 23 CPU, 24 GPU, 26 Main memory, 50 Image obtaining section, 52 Image data storage section, 54 Image analyzing section, 56 Output data generating section, 58 Polarization degree obtaining section, 60 High polarization degree region evaluating section
As described above, the present invention is applicable to various kinds of information processing apparatuses and systems such as a robot control device and an electronic content processing device.
Inventors: Wada, Shinya; Ishida, Takayuki
Patent | Priority | Assignee | Title
10769951 | Sep 30 2015 | Sony Corporation | Image processing apparatus, image processing method, and vehicle control system to determine the presence of an object from an image of a peripheral area of a moving body
10789569 | Nov 27 2017 | Amazon Technologies, Inc. | System to determine item footprint
9025027 | Sep 16 2010 | Ricoh Company, Ltd. | Object identification device, moving object controlling apparatus having object identification device, information presenting apparatus having object identification device, and spectroscopic image capturing apparatus
9599818 | Jun 12 2012 | SONY INTERACTIVE ENTERTAINMENT INC | Obstacle avoidance apparatus and obstacle avoidance method
9630105 | Sep 30 2013 | SONY INTERACTIVE ENTERTAINMENT INC | Camera based safety mechanisms for users of head mounted displays
20120069181
20120242835
20130058528
20130136306
20130328928
20150094142
20160267348
20160307053
20180005012
20180301032
20180361240
20200184233
20230062698
EP2674893
JP2011150688
JP2014016981
JP2017130793
JP2018099523
JP5580855
WO2017056822
Assignee: SONY INTERACTIVE ENTERTAINMENT INC.