Embodiments of the technology involve apparatus and methods for control of displaying of images. In an example, an apparatus may include an image display, a sensor to detect posture of the image display, and a processor to control sequentially displaying images of a group of images on the image display based on changes in the detected posture. The processor may control a display of a posture indicator on the image display such that the indicator may represent a relation between a change in the detected posture and an image of the group of images. Optionally, the indicator may be represented by a tilt meter. Moreover, in some embodiments, the sensor may be implemented with a gyroscopic sensor.
1. A method of a processor to control a display of images, said method comprising:
detecting posture of an image display device with a sensor;
sequentially displaying images of a group of images on the image display device based on changes in the detected posture; and
displaying manipulation support information, the manipulation support information representing a relation between a change in the detected posture and an image of the group of images,
in which a relationship between a rotation angle associated with the posture of the image display device and a manipulation range representative of an overall angular range is determined according to a number of images to be displayed, and
said manipulation support information indicating a position of the image of the group of images within the manipulation range.
14. An apparatus for control of displaying of images comprising:
an image display;
a sensor to detect posture of the image display; and
a processor to (i) control sequentially displaying images of a group of images on the image display based on changes in the detected posture, and (ii) control a display of manipulation support information, the manipulation support information representing a relation between a change in the detected posture and an image of the group of images,
in which a relationship between a rotation angle associated with the posture of the image display and a manipulation range representative of an overall angular range is determined according to a number of images to be displayed, and
said manipulation support information indicating a position of the image of the group of images within the manipulation range.
26. A system for control of displaying images comprising:
a display means for image displaying;
a posture means for sensing posture of the display means;
a control means for (i) controlling sequentially displaying images of a group of images on the display means based on changes in the sensed posture, and (ii) controlling a display of manipulation support information, the manipulation support information representing a relation between a change in the sensed posture and an image of the group of images,
in which a relationship between a rotation angle associated with the posture of the display means and a manipulation range representative of an overall angular range is determined according to a number of images to be displayed, and
said manipulation support information indicating a position of the image of the group of images within the manipulation range.
2. The method of
3. The method of
4. The method of
5. The method of
7. The method of
10. The method of
11. The method of
12. The method of
13. The method of
16. The apparatus of
17. The apparatus of
18. The apparatus of
19. The apparatus of
20. The apparatus of
21. The apparatus of
22. The apparatus of
23. The apparatus of
24. The apparatus of
25. The apparatus of
27. The system of
28. The system of
29. The system of
The present application is a national phase entry under 35 U.S.C. §371 of International Application No. PCT/JP2011/001405 filed Mar. 10, 2011, published on Sep. 22, 2011 as WO 2011/114667 A1, which claims priority from Japanese Patent Application No. JP 2010-063574 filed in the Japanese Patent Office on Mar. 19, 2010.
The present invention relates to an image processing device, and more particularly, to an image processing device and an image processing method for displaying an image, and a program for causing a computer to execute the method.
In recent years, imaging devices such as digital still cameras and digital video cameras (e.g., camera-integrated recorders), which image a subject such as a person or an animal to generate image data and record the image data as image content, have come into widespread use.
For example, there is a reproduction apparatus for performing image advancing by user manipulation using a manipulation member and sequentially displaying a plurality of images. Also, there is a reproduction apparatus in which a user changes a posture of the reproduction apparatus to change displayed content of a display unit.
For example, an information processing device for obtaining a movement amount or a rotation amount of a body and instructing, for example, to scroll displayed content of a display unit according to the amount has been proposed (e.g., see Patent Literature 1).
Some embodiments of the present technology may involve a method of a processor to control a display of images. The method may involve detecting posture of an image display device with a sensor. The method may also involve sequentially displaying images of a group of images on the image display device based on changes in the detected posture. The method may also involve displaying a posture indicator on the image display device, the indicator representing a relation between a change in the detected posture and an image of the group of images.
Optionally, the indicator may include a graphic tilt meter. In some embodiments, the processor may associate a normalized tilt angle with an image of the sequentially displayed images. The normalized tilt angle may comprise an angle determined as a function of image capture angle information and a maximum tilt angle. In some embodiments, the image capture angle information may comprise a range of captured image angles. Still further, the maximum tilt angle may comprise a display viewing angle limit. In some cases, the method may also involve sequentially displaying entrance images for groups of images on the image display device based on changes in the detected posture. Optionally, the sensor may comprise a gyroscopic sensor.
Some embodiments of the present technology may involve an apparatus for control of displaying of images. The apparatus may include an image display, a sensor to detect posture of the image display and a processor to control sequentially displaying images of a group of images on the image display based on changes in the detected posture. The processor may control a display of a posture indicator on the image display such that the indicator may represent a relation between a change in the detected posture and an image of the group of images.
Optionally, the indicator may comprise a tilt meter. Still further, the processor may control associating a normalized tilt angle with an image of the sequentially displayed images. The detection of posture may comprise a detection of angular velocity. Moreover, the processor may control changing the display based on the angular velocity. In some cases of the apparatus, the processor may control calculating the normalized tilt angle as a function of image capture angle information and a maximum tilt angle. The image capture angle information may comprise a range of captured image angles. The maximum tilt angle may comprise a display viewing angle limit. In some such embodiments, the processor may control sequentially displaying entrance images for groups of images on the image display device based on changes in the detected posture. The sensor may include a gyroscopic sensor.
Still further embodiments of the technology may involve a system for control of displaying images. The system may include a display means for image displaying. The system may also include a posture means for sensing posture of the display means. The system may also include a control means for controlling sequentially displaying images of a group of images on the display means based on changes in the sensed posture. In some cases, the control means may control a display of a means for posture indicating on the image display, the means for posture indicating representing a relation between a change in the sensed posture and an image of the group of images. Still further, the control means may control associating of a normalized tilt angle with an image of the sequentially displayed images. Moreover, the control means may control calculating the normalized tilt angle as a function of image capture angle information and a maximum tilt angle. In some cases, such image capture angle information may comprise a range of captured image angles determined by the posture means.
According to the above-described related art, the displayed content of the display unit can be changed by changing the posture of a device, making it possible for a user to perform a changing manipulation in a state in which the user holds the device by hand.
Here, a case in which image advancing or image returning is performed on a plurality of images by changing posture of a device is assumed. In this case, a user performs a manipulation to change posture of the device while viewing images sequentially displayed on the display unit. Accordingly, it is understood that the user can easily perform the manipulation if he or she can easily recognize an image advancing or image returning timing or a position relationship of display images in an image group while viewing images sequentially displayed on the display unit.
In light of the foregoing, it is desirable to easily perform a manipulation to sequentially display a plurality of images using a manipulation method of changing posture of a device.
Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the appended drawings. Note that, in this specification and the appended drawings, structural elements that have substantially the same function and structure are denoted with the same reference numerals, and repeated explanation of these structural elements is omitted.
The description will be given in the following order.
1. First embodiment (display control: an example in which an indicator or a tilt meter superimposed on an image in a group is displayed in a group image display mode)
2. Variant
The imaging device 100 includes an input/output panel 101 and a shutter button 102. The imaging device 100 is an image processing device that can be carried by a user and the user can view an image displayed on the input/output panel 101. The imaging device 100 is one example of an image processing device defined in the claims.
The input/output panel 101 displays various images and receives a manipulation input from a user by detecting a contact manipulation on the input/output panel 101. The shutter button 102 is a button pressed by a photographer when image data (an imaged image) generated by imaging a subject is recorded as image content (a still image file).
The imaging device 100 also includes other manipulation members such as a power switch and a mode switching switch, a lens unit, and the like, but illustration and description of these elements are omitted here for ease of explanation. Here, the lens unit (not shown) is an optical system including a plurality of lenses for focusing light from the subject, an aperture, and the like.
Here, a change of posture of the imaging device 100 will be described. For example, the user may change rotation angles (i.e., a yaw angle, a pitch angle, and a roll angle) about three orthogonal axes in a state in which the user holds the imaging device 100 by hand. For example, the user may change the posture of the imaging device 100 in a direction indicated by the arrow 302 (the yaw angle), the axis of which is the arrow 300 (or the arrow 301). The change example is shown in
Further, for example, the user may change the posture of the imaging device 100 by moving (sliding) the imaging device 100 along a straight line on a plane in a state in which the user holds the imaging device 100 by hand. For example, the user may change the posture by moving the imaging device 100 in the direction indicated by the arrow 300 or the arrow 301 (movement in a vertical direction in
The image content storage unit 200 stores image data (an imaged image) generated by the imaging unit (not shown) as an image file (image content (still image content or moving image content)). The image content storage unit 200 supplies the stored image content to the representative image reproduction unit 150 or the group image reproduction unit 160. In the first embodiment of the present invention, an example in which the still image content is used as the image content is shown.
The image management information storage unit 210 stores management information (image management information) on the image content stored in the image content storage unit 200. Using the image management information, reproduction in the representative image display mode and the group image display mode is performed. Here, the representative image display mode is a mode in which a representative image of grouped image contents and an image of non-grouped image content among the image contents stored in the image content storage unit 200 are sequentially displayed according to user manipulation. The group image display mode is a mode in which images of each grouped image content among the image contents stored in the image content storage unit 200 are sequentially displayed according to the change of the posture of the imaging device 100. The image content storage unit 200 and the image management information storage unit 210 may be, for example, one or a plurality of removable recording media, such as discs (e.g., digital versatile discs (DVDs)) or semiconductor memories (e.g., memory cards). The recording media may be embedded in the imaging device 100 or detachably provided in the imaging device 100.
The input/output unit 110 includes a display unit 111 and a manipulation receiving unit 112. The display unit 111 is a display unit for displaying an image supplied from the representative image reproduction unit 150 or the group image reproduction unit 160. Various menu screens or various images are displayed on the display unit 111. The display unit 111 may be, for example, a liquid crystal display (LCD) or an organic electro luminescence (EL) panel.
The manipulation receiving unit 112 receives the content of manipulations by the user and supplies a manipulation signal dependent on the received manipulation content to the control unit 140. The manipulation receiving unit 112 corresponds to, for example, a manipulation member such as the shutter button 102 shown in
The posture detection unit 120 detects a change of the posture of the imaging device 100 by detecting acceleration, movement, tilt and the like of the imaging device 100, and outputs posture change information about the detected change of the posture to the analysis unit 130. For example, the posture detection unit 120 detects a movement direction and a movement amount in a specific direction of the imaging device 100, and an angular velocity in the specific direction (an angular velocity when the posture of the imaging device 100 is changed) as the change of the posture of the imaging device 100. The posture detection unit 120 may be realized by a gyro sensor (angular velocity sensor) or an acceleration sensor. The posture detection unit 120 is an example of a detection unit defined in the claims.
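As one illustration of how a posture detection unit might turn gyro output into a tilt angle, the following Python sketch integrates angular-velocity samples over time. The class and its names are hypothetical; an actual posture detection unit 120 might also fuse accelerometer data and correct for drift.

```python
from dataclasses import dataclass

@dataclass
class PostureTracker:
    """Accumulates gyro angular-velocity samples into a tilt angle.

    Illustrative sketch only: names and structure are hypothetical,
    not taken from the embodiment described above.
    """
    angle_deg: float = 0.0  # current tilt angle about the rotation axis

    def update(self, angular_velocity_dps: float, dt_s: float) -> float:
        # Integrate angular velocity (degrees/second) over one sample period.
        self.angle_deg += angular_velocity_dps * dt_s
        return self.angle_deg

tracker = PostureTracker()
# Rotating at 90 deg/s for 0.5 s (fifty 10 ms samples) yields about 45 degrees.
for _ in range(50):
    tracker.update(90.0, 0.01)
```

The accumulated angle could then be compared against reference angles to drive image advancing or returning.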
The analysis unit 130 analyzes the change amount (e.g., the movement direction and the movement amount) of the posture of the imaging device 100 based on the posture change information output from the posture detection unit 120 to generate analysis information for performing display switching on the image or the manipulation support information displayed on the display unit 111. The analysis unit 130 outputs the analysis information to the group image reproduction unit 160.
The control unit 140 controls each unit of the imaging device 100 based on the manipulation content from the manipulation receiving unit 112. For example, the control unit 140 sets the representative image display mode when the manipulation receiving unit 112 receives a representative image display mode setting manipulation and sets the group image display mode when the manipulation receiving unit 112 receives a group image display mode setting manipulation.
Further, when the group image display mode setting manipulation is received, the control unit 140 performs a control to sequentially display, on the display unit 111, the images in the group corresponding to the representative image displayed on the display unit 111 upon the setting manipulation. When the group image display mode is set, the control unit 140 performs a control to display, on the display unit 111, the manipulation support information indicating a relationship between the change of the posture of the imaging device 100 and the display state in the display unit 111. The manipulation support information is information for supporting a manipulation to change the display state in the display unit 111 and is, for example, an indicator 660 shown in
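The relationship the manipulation support information expresses, namely the position of the current image within the overall manipulation range, can be sketched as follows. This is a hedged illustration: the function name and the equal-width division of the manipulation range by the number of images are assumptions, not the patent's stated implementation.

```python
def indicator_position(tilt_deg: float, max_tilt_deg: float, num_images: int) -> int:
    """Map a tilt angle in [-max_tilt_deg, +max_tilt_deg] to an image slot.

    Hypothetical sketch: the manipulation range (the overall angular range)
    is divided into num_images equal slots, and the returned index marks
    which slot the current posture falls in.
    """
    # Clamp the detected tilt to the manipulation range.
    tilt_deg = max(-max_tilt_deg, min(max_tilt_deg, tilt_deg))
    # Normalize to [0, 1], then quantize into num_images slots.
    fraction = (tilt_deg + max_tilt_deg) / (2.0 * max_tilt_deg)
    return min(int(fraction * num_images), num_images - 1)
```

For example, with a 5-image group and a 90-degree maximum tilt, a level posture (0 degrees) falls in the middle slot, while the two extremes of the range map to the first and last slots.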
When the representative image display mode has been set and an instruction manipulation to instruct display switching (e.g., image advancing and image returning) of the image displayed on the display unit 111 is received, the control unit 140 instructs the representative image reproduction unit 150 to switch the image display based on the instruction manipulation. Further, when a group image display mode setting manipulation is performed, the control unit 140 acquires the image management information (the manipulation method 215, or the like, shown in
The representative image reproduction unit 150 displays, on the display unit 111, image content corresponding to the representative image and non-grouped image content among the image contents stored in the image content storage unit 200 under control of the control unit 140. Specifically, when the representative image display mode has been set, the representative image reproduction unit 150 acquires the image content corresponding to the representative image and non-grouped image content from the image content storage unit 200. Subsequently, the representative image reproduction unit 150 decodes the acquired image content and renders a display image in the rendering memory 170 based on the decoded image content. The representative image reproduction unit 150 sequentially displays one of the images rendered in the rendering memory 170 on the display unit 111 under control of the control unit 140. The representative image reproduction unit 150 displays, on the display unit 111, the representative image, and manipulation support information (e.g., the manipulation support information 261 shown in
The group image reproduction unit 160 displays the grouped image contents among the image contents stored in the image content storage unit 200 on the display unit 111 in a group unit under control of the control unit 140. Specifically, when the group image display mode is set, the group image reproduction unit 160 acquires each image content belonging to the group corresponding to the representative image displayed upon the group image display mode setting manipulation from the image content storage unit 200. Further, the group image reproduction unit 160 acquires the image management information associated with each image content belonging to the group from the image management information storage unit 210. Subsequently, the group image reproduction unit 160 outputs the acquired image management information to the manipulation method determination unit 180. Moreover, for example, the group image reproduction unit 160 decodes the acquired image content and renders a display image in the rendering memory 170 based on the decoded image content. In this case, the group image reproduction unit 160 arranges display images and renders the images in the rendering memory 170, for example, according to a predetermined rule.
When the group image display mode setting manipulation is received, the group image reproduction unit 160 displays each image belonging to a group to be displayed and manipulation support information on the display unit 111 to be associated with each other. In this case, for example, the group image reproduction unit 160 changes content of the manipulation support information based on the change of the posture of the imaging device 100 detected by the posture detection unit 120. That is, the group image reproduction unit 160 sequentially displays, on the display unit 111, one image from among the images rendered in the rendering memory 170 based on the image display switching instruction output from the analysis unit 130 (an image advancing or returning instruction).
The rendering memory 170 is a rendering buffer for holding the images rendered by the representative image reproduction unit 150 or the group image reproduction unit 160, and supplies the rendered images to the representative image reproduction unit 150 or the group image reproduction unit 160.
The manipulation method determination unit 180 determines the manipulation method used when grouped image contents are displayed on the display unit 111 in group units, based on the image management information output from the group image reproduction unit 160. For example, the manipulation method determination unit 180 normalizes the angle information assigned to a plurality of image contents, for each image content, using the range over which the posture of the imaging device 100 is changed as a reference. The manipulation method determination unit 180 then determines the manipulation method by associating positions of the posture of the imaging device 100 with the plurality of image contents, for each image content, based on the normalized angle information, and outputs the determined manipulation method to the group image reproduction unit 160. A method of determining the manipulation method will be described in detail with reference to
(Stored Content of Image Management Information Storage Unit)
Identification information for identifying each group is stored in the group identification information 211. Here, a group is an image group including a plurality of image contents that are associated with one another and whose order is specified based on a predetermined rule. In the first embodiment of the present invention, group identification information is assigned both to image groups each including a plurality of image contents and to image contents not belonging to any image group, so that the two can be distinguished.
Identification information (image content identification information) for identifying each image content stored in the image content storage unit 200 is stored in the image content identification information 212. For example, the image content identification information of each image content is stored in an image advancing or returning order (e.g., an order of photographing time (record date)).
Angle information corresponding to an imaging operation performed when the image content stored in the image content storage unit 200 was generated is stored in the angle information 213. For example, when a multi-view image is photographed using the imaging device 100 and the multi-view image generated by the photographing is stored in the image content storage unit 200, an angle upon photographing (an angle with a predetermined position used as a reference) is stored as the angle information. “Absence” is shown in the field of image content to which no angle information is assigned. Here, a multi-view image is a plurality of images generated by an imaging operation that changes the view with respect to a target subject (e.g., a face of a person) by moving the imaging device 100 using the target subject as a reference. The imaging operation will be described in detail with reference to
Representative image information for specifying a representative image among a plurality of grouped image contents is stored in the representative image 214.
A manipulation method when a plurality of grouped image contents are sequentially displayed in the case in which the group image display mode has been set is stored in the manipulation method 215. The manipulation method is a manipulation method relating to the change of the posture of the imaging device 100. For example, “horizontal rotation” refers to a manipulation to sequentially display a plurality of image contents by rotating the imaging device 100 in the direction indicated by the arrow 302 about the arrow 300 (301) shown in
The type of a group (image group) including a plurality of mutually associated image contents whose order is specified based on a predetermined rule is stored in the group type 216.
The information stored in the group identification information 211, the angle information 213, the representative image 214, the manipulation method 215 and the group type 216 may be recorded in an image file (image content). For example, such information can be recorded as attribute information relating to a still image file. The information recorded in the image content may be used upon reproduction of the image content.
In the group type display area 262, the type of the group corresponding to the displayed representative image is displayed. For example, when the type of the group corresponding to the displayed representative image is a group relating to the multi-view image, “multi-view” is displayed in the group type display area 262.
In the image size display area 263, an image size of each image belonging to the group corresponding to the displayed representative image (e.g., an image size of one image) is displayed.
The manipulation support information 261 is a manipulation button for supporting a manipulation to set the group image display mode upon setting of the representative image display mode. That is, the user can set the group image display mode by pressing the manipulation support information 261. Further, the same letters as the group type in the group type display area 262 are displayed in a rectangle corresponding to the manipulation support information 261.
Thus, when the representative image display mode has been set, the display aspect of the representative image is changed for each image group based on the correlation (e.g., the group type) between images in the image group associated with the representative image displayed on the input/output panel 101, and then the representative image is displayed.
On the other hand, when the non-grouped image contents (e.g., #6 and #13) are displayed on the input/output panel 101 as shown in
Here, a case in which a group image display mode setting manipulation (the manipulation to press the manipulation support information 261) is performed in a state in which the representative image (#9) of the grouped image content has been displayed on the input/output panel 101 is assumed. In this case, the group image reproduction unit 160 acquires the image contents belonging to the group corresponding to the representative image displayed when the group image display mode setting manipulation is performed, from the image content storage unit 200. Further, the group image reproduction unit 160 acquires the image management information associated with each image content belonging to the group from the image management information storage unit 210. Subsequently, the manipulation method determination unit 180 determines the images to be displayed from among the image contents belonging to the group, based on the acquired image management information, as well as a manipulation method for displaying those images. The method of determining the images to be displayed and the method of determining the manipulation method will be described in detail with reference to
Subsequently, the group image reproduction unit 160 renders each image in the rendering memory 170 based on the acquired image content. In this case, the group image reproduction unit 160 renders each image in the rendering memory 170 according to a predetermined rule (e.g., the order of image content identification information). An image rendering example in the rendering memory 170 is shown in
Specifically, the images (C0 to C2) 510 to 512 are imaged images when the right oblique side of the person 500 is photographed (when the angle information Ai in the imaging operation is −60 degrees, −55 degrees, and −25 degrees). Further, the images (C3 to C5) 513 to 515 are imaged images when the left oblique side of the person 500 is photographed (when the angle information Ai in the imaging operation is +40 degrees, +55 degrees, and +60 degrees).
Thus, the images (C0 to C5) 510 to 515 generated by changing a view with respect to the target subject (the face of the person 500) are grouped as a multi-view image and stored in the image content storage unit 200. Further, such image management information (e.g., the angle information and the group type) is stored in the image management information storage unit 210. When the images (C0 to C5) 510 to 515 are displayed using the imaging device 100, the images (C0 to C5) 510 to 515 in the group can be sequentially displayed by horizontally rotating the imaging device 100.
Here, the images (C0 to C5) 510 to 515 are images obtained by imaging the face of the same person from different views. Accordingly, the user can obtain the feeling of viewing a three-dimensional object (the face of the person 500) by quickly tilting the imaging device 100 to the left and right so that image advancing or image returning is performed rapidly. That is, the user manipulation of tilting the imaging device 100 to the left and right gives the user the feeling of viewing a pseudo-three-dimensional image.
Thus, it is possible to enjoy a pseudo-three-dimensional image as a profound image by performing high-speed image advancing or image returning by a gesture manipulation of the user on the image group obtained by photographing the target subject from several angles. That is, when the user desires to three-dimensionally view the object, such as a face of a person or a vehicle, he or she can easily view a pseudo-three-dimensional image of the object through a manual manipulation.
When the image group is displayed as such, a determination of the angle information (the overall rotation angle range and the reference angle) associated with the manipulation method is important. Hereinafter, a method of determining the angle information associated with the manipulation method will be described in detail with reference to the drawings.
Vi = γ × j (Equation 1)
Here, γ is a value indicating the tilt accuracy that a person can maintain, for example, without hand shake. Such a value may be determined in consideration of the accuracy or performance of the posture detection unit 120 (a tilt detection device). Further, j is an integer satisfying −m ≤ j ≤ m, and i is an integer satisfying 0 ≤ i ≤ n.
Here, the angle information Ai assigned to each image Ci in the image group is an angle with respect to the target subject in an imaging operation. Accordingly, it is preferable to associate each image Ci with the tilt angle Vi based on the angle information Ai. The angle information Ai is normalized so that the angle information Ai corresponds to the tilt angle Vi. Specifically, the normalization angle information Bi for which the angle information Ai has been normalized may be obtained by Equation 2:
Bi = (Vmax/Aabs) * Ai (Equation 2)
Here, Vmax is the absolute value of the maximum tilt angle, that is, the largest angle at which the user can still view the display screen; it is limited by, for example, the viewing angle of the screen. Further, Aabs is a value (absolute value) indicating the range of the angle information Ai.
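A sketch of this normalization. The imaging angles A0 to A5 below are hypothetical (A1 in particular is not given in the text), chosen so that the results reproduce the Bi values quoted in the selection example later in this section; Vmax = 90 degrees is likewise assumed:

```python
def normalize_angles(angles, v_max):
    """Equation 2: Bi = (Vmax / Aabs) * Ai, where Aabs is the absolute
    value indicating the range of the angle information Ai."""
    a_abs = max(abs(a) for a in angles)  # range of the angle information
    return [(v_max / a_abs) * a for a in angles]

# Hypothetical angle information A0..A5 recorded at imaging time
angles = [-120.0, -80.0, -50.0, 80.0, 110.0, 120.0]
print(normalize_angles(angles, v_max=90.0))
# → [-90.0, -60.0, -37.5, 60.0, 82.5, 90.0]
```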
In the example shown in
Accordingly, in the example shown in
In
The image Ci can be properly assigned to the tilt angle (reference angle) Vi by normalizing the angle information in the imaging operation associated with the image Ci.
In
In
Here, in the example shown in
Specifically, the image Ci associated with the normalization angle information Bi closest to the tilt angle (reference angle) Vi is selected as an image CCi for each tilt angle (reference angle) Vi from the image group (image Ci (0<=i<=n)) for which the normalization angle information Bi has been calculated.
In
Specifically, the image C0 associated with the normalization angle information B0(=−90 degrees) closest to the tilt angle V0(=−90 degrees) is selected and associated with the tilt angle V0, as an image CC0. Similarly, the image C2 associated with the normalization angle information B2(=−37.5 degrees) closest to the tilt angle V1(=−45 degrees) is selected and associated with the tilt angle V1, as an image CC1. Further, the image C3 associated with the normalization angle information B3(=+60 degrees) closest to the tilt angle V2(=0 degree) is selected and associated with the tilt angle V2, as an image CC2. Further, the image C4 associated with the normalization angle information B4(=+82.5 degrees) closest to the tilt angle V3(=45 degrees) is selected and associated with the tilt angle V3, as an image CC3. Further, the image C5 associated with the normalization angle information B5(=+90 degrees) closest to the tilt angle V4(=90 degrees) is selected and associated with the tilt angle V4 as an image CC4.
That is, CCi(={C0, C2, C3, C4, C5}={CC0, CC1, CC2, CC3, CC4}) is selected from the image group (image Ci (0<=i<=5)).
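The selection of CCi can be sketched as follows. Note that a naive, independent nearest match would pick C2 for both V1 and V2; the sketch therefore assumes each image is assigned at most once and in order, a reading that reproduces the selection {C0, C2, C3, C4, C5} given in the text:

```python
def select_images(ref_angles, norm_angles):
    """For each tilt (reference) angle Vi, pick the image whose normalized
    angle Bi is nearest, assigning each image at most once and preserving
    order (an assumed reading that matches the example in the text)."""
    selected = []
    start = 0  # first image index still available
    for k, v in enumerate(ref_angles):
        # leave enough images for the remaining reference angles
        last = len(norm_angles) - (len(ref_angles) - k)
        best = min(range(start, last + 1), key=lambda i: abs(norm_angles[i] - v))
        selected.append(best)
        start = best + 1
    return selected

refs = [-90.0, -45.0, 0.0, 45.0, 90.0]           # V0..V4
norm = [-90.0, -60.0, -37.5, 60.0, 82.5, 90.0]   # B0..B5
print(select_images(refs, norm))  # → [0, 2, 3, 4, 5], i.e. CC0..CC4
```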
The selected images CCi associated with the tilt angles Vi are sequentially displayed according to the tilt of the imaging device 100. For example, when the posture of the imaging device 100 is at the tilt angle (reference angle) Vi, the image CCi associated with the tilt angle (reference angle) Vi is displayed. The image CCi continues to be displayed until the posture of the imaging device 100 reaches the tilt angle (reference angle) Vi−1 or the tilt angle (reference angle) Vi+1.
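This hold behavior can be sketched as a single-step update (a real device would also handle a tilt that jumps across several reference angles at once; names are illustrative):

```python
def image_for_tilt(ref_angles, current_index, tilt):
    """Hold behavior described above: the current image CCi stays displayed
    until the posture reaches the adjacent reference angle V(i-1) or V(i+1)."""
    i = current_index
    if i + 1 < len(ref_angles) and tilt >= ref_angles[i + 1]:
        return i + 1
    if i - 1 >= 0 and tilt <= ref_angles[i - 1]:
        return i - 1
    return i

refs = [-90.0, -45.0, 0.0, 45.0, 90.0]  # V0..V4, as in the earlier example
print(image_for_tilt(refs, 2, 50.0))  # tilt crossed V3 = 45 → 3
```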
In
The manipulation support information 520 is an indicator indicating a relationship between the images belonging to the group to be displayed (all extracted images to be displayed) and images displayed on the input/output panel 101 when the group image display mode has been set. Specifically, the manipulation support information 520 indicates the relationship between all the extracted images to be displayed (C0 and C2 to C5) 510, and 512 to 515 and one image displayed on the input/output panel 101. For example, a rectangular area in the manipulation support information 520 corresponding to one image displayed on the input/output panel 101 among all the images (C0 and C2 to C5) 510 and 512 to 515 has a different display aspect from other areas. In the example shown in
That is, in the state shown in
Thus, when the group image display mode has been set, the total number of images to be displayed and a position of a display image in all the images to be displayed (a position in a display order) can be easily recognized by displaying both the image to be displayed and the indicator.
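A text-mode sketch of such an indicator (the '#' and '-' symbols stand in for the differing display aspects; they are not from the source):

```python
def indicator_segments(total, current):
    """One rectangular area per image to display; the area corresponding to
    the image currently shown has a different display aspect ('#') from the
    other areas ('-')."""
    return ["#" if i == current else "-" for i in range(total)]

print("".join(indicator_segments(5, 1)))  # → -#---
```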
In the foregoing, the example has been shown in which the angle information assigned to each image is normalized, the images are assigned to angles at certain intervals, and the images are displayed according to those angles. Alternatively, the tilt angle (reference angle) may be determined according to the intervals of the calculated normalization angle information of each image, and each image may be displayed according to that angle. Further, when the indicator is displayed as the manipulation support information, the display area of the indicator may be calculated according to the intervals of the calculated normalization angle information of each image. A method of calculating the display area of the indicator and a display example are shown in
(Variant of Indicator Display)
In
In
Wi = (BBi/Vmax) * L (Equation 3)
Here, L is a value indicating a length of the indicator displayed on the input/output panel 101. Further, BBi is a value calculated by the above-described normalization.
Here, the display area Wi of the indicator is a value indicating a length from a left end in the indicator (a length L) shown in
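A minimal sketch of Equation 3 (the indicator length L = 300 is an assumed value for illustration):

```python
def indicator_width(bb_i, v_max, length):
    """Equation 3: Wi = BBi / Vmax * L — the display area of the indicator,
    measured as a length from its left end (total indicator length L)."""
    return bb_i / v_max * length

print(indicator_width(45.0, 90.0, 300))  # → 150.0
```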
In
In
In
In the above description, the case in which image contents stored in association with angle information are displayed has been described by way of example. However, image contents in which no angle information was recorded in the imaging operation may also be displayed. Hereinafter, this case will be described.
Here, when image content in which no angle information was recorded in the imaging operation is displayed and the number of images to be displayed is great, image advancing or image returning is likely to be performed even when the tilt manipulation of the imaging device 100 by the user is small. In this case, unintentional image advancing or image returning may occur. Therefore, when such image content is displayed, the number of images to be displayed is determined according to the range over which the user can tilt the imaging device 100, and only those images are displayed.
In
For example, when the manipulation method determination unit 180 determines the number M of images to be displayed, the manipulation method determination unit 180 may obtain the number M using Equation 4.
M = Vmax/γ + 1 (Equation 4)
Here, M is a value indicating the number of images that can be displayed. Further, Vmax is the absolute value of the maximum tilt angle, as in Equation 2, and γ is a value indicating the tilt accuracy that can be maintained by a person, for example, without hand shaking, as in Equation 1.
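A sketch of Equation 4 (the values Vmax = 90 degrees and γ = 15 degrees are assumed for illustration):

```python
def displayable_count(v_max, gamma):
    """Equation 4: M = Vmax / γ + 1 — the number of images that can be
    displayed given the tilt accuracy γ a person can maintain."""
    return int(v_max / gamma) + 1

print(displayable_count(90.0, 15.0))  # → 7
```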
In the example shown in
Here, in the example shown in
In
For example, when the manipulation method determination unit 180 extracts the images to be displayed, Equation 5 is calculated for each image Ci belonging to the group to be displayed.
i mod D < (D − M) (Equation 5)
Here, x mod y denotes the remainder when x is divided by y. Further, D is the number of images to be displayed and M is the number of images that can be displayed.
Specifically, the manipulation method determination unit 180 determines whether i mod D<(D−M) is satisfied for each image Ci belonging to the group to be displayed. The manipulation method determination unit 180 removes the image Ci for which i mod D<(D−M) is satisfied from among the images to be displayed. That is, the manipulation method determination unit 180 extracts the image Ci for which i mod D<(D−M) is not satisfied, as the image to be displayed, and uses the extracted image Ci as an image CCi.
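The extraction step can be sketched as follows. Note that, taking Equation 5 literally with i running from 0 to D−1, i mod D equals i, so the images retained are the last M indices:

```python
def extract_displayable(num_images, m):
    """Keep the images Ci for which Equation 5, i mod D < (D - M), is NOT
    satisfied (D = number of images to display, M = displayable count)."""
    d = num_images
    return [i for i in range(d) if not (i % d < (d - m))]

print(extract_displayable(10, 7))  # → [3, 4, 5, 6, 7, 8, 9]
```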
For example, in the example shown in
In
In
In the foregoing, the example has been shown in which, when the group image display mode has been set, the images to be displayed and the manipulation support information (e.g., the indicator) indicating the total number of images to be displayed and the position of the display image among all the images (its position in the display order) are displayed. Alternatively, the images to be displayed may be shown together with manipulation support information (e.g., a tilt meter) indicating the relationship between the posture of the imaging device 100 and image advancing or image returning. Further, both kinds of manipulation support information (e.g., the indicator and the tilt meter) may be displayed simultaneously. Hereinafter, an example in which the manipulation support information (the tilt meter) indicating the relationship between the posture of the imaging device 100 and image advancing or image returning is displayed will be described.
In
For example, a state 611 shown in the top of
For example, as in states 612 and 613 shown in
Subsequently, when the rotation angle θ at which the imaging device 100 is rotated to the right reaches 20 degrees, all the portions to the right of the middle of the tilt meter become black as shown in a state 605, an indication of reaching the tilt of image advancing is displayed, and image advancing is performed.
After image advancing is performed and a next image is displayed on the input/output panel 101, only the rectangular area in the middle portion of the tilt meter becomes black as shown in state 606. That is, the tilt meter in the state 606 becomes the same as the state in which the imaging device 100 exists on the same plane as the horizontal plane 330 (state 601).
Similarly, when the user rotates the imaging device 100 to the right while holding it, the rectangular areas in the tilt meter become black according to the rotation angle θ, as shown in states 607 and 608 of
Here, when the imaging device 100 is rotated to the right to the angle at which the last image among the images to be displayed is reached, the tilt meter enters the same state as the state 605. Accordingly, even when further rotation to the right is performed, a notification indicating that image advancing will not be performed may be displayed.
Similarly, when the imaging device 100 is rotated to the left, the rectangular areas at the left from the middle of the tilt meter become sequentially black.
While the example in which image advancing or image returning is performed each time the rotation angle of the imaging device 100 exceeds the reference angle (e.g., 20 degrees) has been shown in the foregoing, image advancing or image returning may be performed by another manipulation method. Hereinafter, examples in which image advancing or image returning is performed by other manipulation methods are shown.
In
Subsequently, when the rotation angle θ at which the imaging device 100 is rotated to the right reaches 20 degrees, all portions to the right of the middle of the tilt meter become black as shown in the state 625, an indication of reaching the tilt of image advancing is displayed, and image advancing is performed.
In
When the rotation angle θ at which the imaging device 100 is rotated to the right reaches 20 degrees and the rotation angle θ is then maintained, image advancing is performed, and after a next image is displayed on the input/output panel 101, the state transitions to the state 626. This state 626 is a state in which the state 625 is maintained. That is, the tilt meter in the state 626 is the same as in the state (state 625) in which the rotation angle θ reaches 20 degrees. Further, while the rotation angle θ is maintained, image advancing is repeated at certain intervals.
In
When the rotation angle θ at which the imaging device 100 is rotated to the right reaches 20 degrees and the rotation angle θ is then returned (i.e., when the rotation angle θ becomes less than 20 degrees), image advancing is performed, a next image is displayed on the input/output panel 101, and the state transitions to the state 627. That is, when the rotation angle θ reaches 20 degrees and is then returned, the display of the tilt meter returns according to the returned angle, and image advancing is ceased.
Similarly, when the imaging device 100 is rotated to the left, the rectangular areas at the left from the middle of the tilt meter become sequentially black.
In the foregoing, an example in which image advancing or image returning is performed based on whether the rotation angle of the imaging device 100 exceeds a reference angle of image advancing or image returning has been shown. Here, for example, image advancing or image returning may be performed based on whether the rotational angular velocity of the imaging device 100 exceeds a reference. Hereinafter, an example in which image advancing or image returning is performed based on whether the rotational angular velocity of the imaging device 100 exceeds a reference will be described.
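The angular-velocity trigger can be sketched as follows (the reference value of 20 degrees/sec is assumed for illustration; a real device would obtain the velocity from the posture detection unit, e.g. a gyroscopic sensor, rather than by differencing angles):

```python
def should_advance(prev_angle, curr_angle, dt, velocity_ref=20.0):
    """Trigger image advancing when the rotational angular velocity
    (degrees/sec) exceeds a reference, per the variant described above."""
    velocity = (curr_angle - prev_angle) / dt  # finite-difference estimate
    return velocity >= velocity_ref

print(should_advance(0.0, 15.0, 0.5))  # 30 deg/sec → True
```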
In
In
In
In
Similarly, when the imaging device 100 is rotated to the left, the rectangular areas at the left from the middle of the tilt meter become sequentially black.
In
In
First, the case in which the manipulation support information (the indicator) superimposed on the image to be displayed is displayed on the input/output panel 101 will be described with reference to
Here, a case in which the image 652 is displayed on the input/output panel 101 when the group image display mode has been set is assumed. In this case, since the image 652 is a first image in a display order among the seven images in the same group, a rectangular area at a left end in the indicator 660 has a different display aspect from other rectangular areas. In this case, it is difficult to perform further image returning.
Further, a case in which an image 653 is displayed on the input/output panel 101 when the group image display mode has been set is assumed. In this case, since the image 653 is a last image in the display order among the seven images in the same group, a rectangular area at a right end in the indicator 660 has a different display aspect from the other rectangular areas. In this case, it is difficult to perform further image advancing.
Thus, a mark indicating the position, among the entire plurality of images, of the image displayed on the input/output panel 101 (e.g., the order specified by the normalization angle information) can be displayed as the indicator.
It is possible to easily recognize positions in a display order of images to be displayed in the group by displaying the images to be displayed and the indicator 660 when the group image display mode is set. Accordingly, it is possible to easily perform the image advancing or image returning manipulation while viewing the images to be displayed when performing the image advancing or image returning manipulation.
Next, a case in which the manipulation support information (the tilt meter) superimposed on the image to be displayed is displayed on the input/output panel 101 will be described with reference to
Here, a case in which an image 652 is displayed on the input/output panel 101 when the group image display mode has been set is assumed. In this case, since the image 652 is a first image in the display order among the seven images in the same group, the rectangular areas from a middle of the tilt meter 670 to a left end have a different display aspect from the other rectangular areas. In this case, it is difficult to perform further image returning.
Further, a case in which an image 653 is displayed on the input/output panel 101 when the group image display mode has been set is assumed. In this case, since the image 653 is a last image in the display order among the seven images in the same group, the rectangular areas from the middle of the tilt meter 670 to a right end have a different display aspect from the other rectangular areas. In this case, it is difficult to perform further image advancing.
Thus, it is possible to display the mark indicating the change amount of the posture of the imaging device 100 necessary for changing the display state in the input/output panel 101, as the tilt meter. That is, it is possible to display the mark indicating the change amount of the posture of the imaging device 100 necessary for performing image advancing or image returning on a plurality of images, as the tilt meter.
Since the transition of the display state of the tilt meter 670 according to the tilt angle of the imaging device 100 is the same as the transition examples of
Thus, the user can easily recognize the tilt to display each image to be displayed by displaying the images to be displayed and the tilt meter 670 when the group image display mode has been set. Further, the user can easily recognize the positions in the display order of the images to be displayed in the group. Accordingly, the user can easily perform the image advancing or image returning manipulation while viewing the images to be displayed when performing the image advancing or image returning manipulation.
First, a determination is made as to whether an image content display instruction manipulation is performed (step S901), and when the display instruction manipulation is not performed, monitoring is continued. On the other hand, when the display instruction manipulation is performed (step S901), the control unit 140 sets the representative image display mode and the representative image reproduction unit 150 displays a representative image and non-grouped images on the display unit 111 (step S902).
Subsequently, a determination is made as to whether the image advancing manipulation or image returning manipulation is performed in a state in which the representative image display mode has been set (step S903). When the image advancing manipulation or the image returning manipulation is performed (step S903), the representative image reproduction unit 150 performs display switching of the image displayed on the display unit 111 (step S904). That is, image advancing or image returning of the image displayed on the display unit 111 is performed.
When the image advancing manipulation or image returning manipulation is not performed (step S903), a determination is made as to whether the group image display mode setting manipulation is performed (step S905), and when the group image display mode setting manipulation is not performed, the process proceeds to step S916. On the other hand, when the group image display mode setting manipulation is performed (step S905), a group image display mode setting process is performed (step S920). The group image display mode setting process will be explained in detail with reference to
Subsequently, the group image reproduction unit 160 displays initial manipulation support information (e.g., the manipulation support information 403 shown in
Subsequently, a determination is made as to whether the initial manipulation support information is displayed (step S907), and when the initial manipulation support information is not displayed, the process proceeds to step S910. On the other hand, when the initial manipulation support information is displayed (step S907), a determination is made as to whether a certain time elapses after the group image display mode is set (step S908). When a certain time elapses after the group image display mode is set (step S908), the group image reproduction unit 160 deletes the initial manipulation support information displayed on the display unit 111 (step S909). On the other hand, when a certain time has not elapsed after the group image display mode is set (step S908), a determination is made as to whether a tilt manipulation by the user is performed (step S910). That is, the group image reproduction unit 160 determines whether the posture of the imaging device 100 is changed above a certain amount based on the analysis information output from the analysis unit 130.
When the tilt manipulation by the user is performed (step S910), a determination is made as to whether the initial manipulation support information is displayed (step S911), and when the initial manipulation support information is not displayed, the process proceeds to step S913. On the other hand, when the initial manipulation support information is displayed (step S911), the group image reproduction unit 160 deletes the initial manipulation support information displayed on the display unit 111 (step S912).
Subsequently, the group image reproduction unit 160 changes the display state of the manipulation support information (the indicator or the tilt meter) based on the manipulation amount according to the tilt manipulation by the user and the display image (the image displayed on the display unit 111) (step S913). Subsequently, when the manipulation amount according to the tilt manipulation by the user exceeds a display switching reference, the group image reproduction unit 160 performs display switching (image advancing or image returning) on the image displayed on the display unit 111 (step S914), and the process returns to step S907. When the manipulation amount according to the tilt manipulation by the user does not exceed the display switching reference, the group image reproduction unit 160 does not perform display switching (image advancing or image returning) on the image displayed on the display unit 111 and the process returns to step S907.
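The core of steps S913 and S914 can be sketched as follows (function and variable names are illustrative, not from the source):

```python
def handle_tilt(manipulation_amount, switching_ref, current_index, num_images):
    """Steps S913-S914 in miniature: switch images only when the manipulation
    amount exceeds the display switching reference. Returns (index, switched)."""
    if abs(manipulation_amount) < switching_ref:
        return current_index, False  # only the support display changes (S913)
    step = 1 if manipulation_amount > 0 else -1
    new_index = min(max(current_index + step, 0), num_images - 1)
    # image advancing or image returning (S914); no switch at either end
    return new_index, new_index != current_index

print(handle_tilt(25.0, 20.0, 3, 7))  # → (4, True)
```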
When a tilt manipulation by the user is not performed (step S910), a determination is made as to whether the representative image display mode setting manipulation is performed (step S915), and when the setting manipulation is performed, the process returns to step S902. On the other hand, when the representative image display mode setting manipulation is not performed (step S915), a determination is made as to whether the image content display termination manipulation is performed (step S916), and when the display termination manipulation is performed, the image content reproduction process operation is terminated. When the image content display termination manipulation is not performed (step S916), a determination is made as to whether the group image display mode has been set (step S917). When the group image display mode has been set (step S917), the process returns to step S907, and when the group image display mode has not been set (that is, the representative image display mode has been set), the process returns to step S903. Steps S906 to S920 are one example of a control procedure defined in the claims. Step S910 is one example of a detection procedure defined in the claims.
First, the group image reproduction unit 160 acquires each image content belonging to a group corresponding to the representative image displayed on the display unit 111 upon a group image display mode setting manipulation from the image content storage unit 200 (step S921). Further, the group image reproduction unit 160 acquires image management information associated with each image content belonging to the group from the image management information storage unit 210 (step S921).
Subsequently, the manipulation method determination unit 180 determines whether angle information is included in the acquired image management information (step S922). When the angle information is included in the acquired image management information (step S922), the manipulation method determination unit 180 normalizes the angle information included in the acquired image management information for each image content (step S923).
Subsequently, the manipulation method determination unit 180 associates the acquired image content with the tilt angle (the reference angle) based on the normalized angle information (normalization angle information) (step S924). In this case, when the number of the acquired image contents is greater than the number of tilt angles (reference angles), the manipulation method determination unit 180 performs an interleaving process on the acquired image contents so that the number of image contents is in a range of the number of tilt angles (reference angles). Subsequently, the manipulation method determination unit 180 calculates the display area of the indicator based on a correspondence relationship resulting from the association (step S925).
Further, when the angle information is not included in the acquired image management information (step S922), the manipulation method determination unit 180 determines whether the number of the acquired image contents is greater than the number of images that can be displayed (the number of tilt angles (reference angles)) (step S926). When the number of acquired image contents is equal to or less than the number of images that can be displayed (step S926), the process proceeds to step S928. On the other hand, when the number of the acquired image contents is greater than the number of images that can be displayed (step S926), the manipulation method determination unit 180 performs an interleaving process on the acquired image contents so that the number of images is in a range of the number of images that can be displayed (step S927).
Subsequently, the manipulation method determination unit 180 associates the acquired image contents (or the image contents after the interleaving process) with the tilt angles (reference angles) (step S928). The manipulation method determination unit 180 then calculates the display area of the indicator based on a correspondence relationship resulting from the association (step S929).
In the foregoing, the example has been shown in which the manipulation support information (the tilt meter or the indicator) is structured as a rod-shaped rectangle. In this variant, an example of manipulation support information (a tilt meter or an indicator) displayed with another structure is shown. The configuration of the imaging device 100 is the same as in the first embodiment of the present invention except that the display aspect of the manipulation support information differs. Accordingly, hereinafter, the differences from the first embodiment of the present invention will be mainly described, and a description of the same portions will be omitted.
In
In
Further, in the example shown in
For example, a case in which the user rotates the imaging device 100 to the left in a state in which the user holds the imaging device 100 is assumed. For example, when an angle at which the imaging device 100 is rotated to the left reaches 20 degrees, the tilt meter 712 with an arrow from the center of the circle constituting the tilt meter to a position on the circle corresponding to the rotation angle (20 degrees) is displayed on the image 701. Here, in this case, image advancing is not performed. Further, a case in which the user rotates the imaging device 100 to the left in a state in which the user holds the imaging device 100 and the rotation angle reaches 30 degrees is assumed. In this case, the tilt meter with an arrow from the center of the circle constituting the tilt meter to a position on the circle (a bottom of the circle) corresponding to the rotation angle (30 degrees) is displayed on the image 701, an indication of reaching a tilt of image advancing is displayed, and image advancing is performed.
Similarly, when the imaging device 100 is rotated to the right, a tilt meter 713 with an arrow from the center of the circle constituting the tilt meter to the position on the circle corresponding to the rotation angle is superimposed on the image 701 and displayed.
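The arrow geometry of this circular tilt meter can be sketched as follows, assuming the manipulation range of ±30 degrees maps onto the full circle, so that 0 degrees points to the top and ±30 degrees to the bottom (consistent with the 30-degree state described above; the mapping is an assumption):

```python
import math

def arrow_endpoint(rotation_deg, max_rotation=30.0, radius=1.0):
    """Endpoint of the tilt-meter arrow, drawn from the circle's center to
    the point on the circle corresponding to the rotation angle.
    Returns (x, y) with y pointing up."""
    t = math.radians(rotation_deg / max_rotation * 180.0)
    return (radius * math.sin(t), radius * math.cos(t))

print(arrow_endpoint(0.0))   # top of the circle
print(arrow_endpoint(30.0))  # bottom of the circle
```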
In
In
Further, in the example shown in
For example, a case in which the user rotates the imaging device 100 to the left in a state in which the user holds the imaging device 100 and the rotational angular velocity reaches 20 degrees/sec is assumed. In this case, a tilt meter 721 with an arrow from the center of the circle constituting the tilt meter to a position on the circle corresponding to the rotational angular velocity (20 degrees/sec) is displayed on the image 701, an indication of reaching a rotational angular velocity of image advancing is displayed, and image advancing is performed. The image 702 is displayed by image advancing. Further, when a state of the angle is maintained after image advancing (i.e., in the case of 0 degree/sec), the original state is returned. That is, a tilt meter 722 with an arrow from the center of the circle constituting the tilt meter to a position on the circle (a top of the circle) corresponding to the rotational angular velocity (0 degree/sec) is displayed on the image 702.
Similarly, when the imaging device 100 is rotated to the right, a tilt meter with an arrow from the center of the circle constituting the tilt meter to the position on the circle corresponding to the rotational angular velocity is displayed.
In the tilt meter shown in
In
In
Since a display state in the indicator shown in
Thus, based on the correlativity (e.g., corresponding to the group type) between the images in the image group associated with the image displayed on the input/output panel 101, the display aspect of the indicator is changed from image group to image group, and the indicator is then displayed.
Further, when the group image display mode has been set, the user can easily perform the manipulation to display each image to be displayed because the image to be displayed and the display support information indicating the type of the group to be displayed are both shown. That is, when the image advancing or image returning manipulation is performed, the user can easily perform it while viewing the image to be displayed.
As described above, in the embodiment of the present invention, the initial manipulation support information (e.g., the manipulation support information 403 shown in
Further, for example, the user can easily recognize a tilt angle at which a view is switched by displaying the tilt meter when the multi-view image is displayed upon setting of the group image display mode. Further, for example, even when an image in which a change of a switched view is small is displayed, the user can easily recognize that the image is switched by the user performing a manipulation to tilt the device.
Further, for example, the user can easily recognize one of all views to which a currently displayed view corresponds by displaying the indicator when the multi-view image is displayed upon setting of the group image display mode. Further, for example, when an image of a view photographed from a rightmost side is displayed, the user can easily recognize that the view is not switched even though a right side is further tilted down.
In the example shown above, the example in which the display state in the input/output panel 101 is changed by rotating the imaging device 100 about the vertical direction of the imaging device 100 has been shown. Here, the embodiment of the present invention may be applied to a case in which the display state in the input/output panel 101 is changed by rotating the imaging device 100 about the horizontal direction of the imaging device 100. In this case, for example, the image may be changed by up and down tilt manipulations, and the indicator or the tilt meter may be vertically arranged and displayed. Further, for example, when an image in which there is a view in both directions of up and down and left and right is displayed, view switching may be performed by a tilt manipulation in both directions of up and down and left and right, and the indicator or the tilt meter may be displayed for both directions of up and down and left and right. The embodiment of the present invention may also be applied to a case in which the display state in the input/output panel 101 is changed by sliding (parallel movement) toward any of up, down, left, right, backward and forward directions of the imaging device 100.
While the imaging device has been described by way of example in the embodiments of the present invention, the embodiments of the present invention may be applied to an image processing device capable of displaying image contents stored in the recording medium on the display unit. For example, the embodiments of the present invention may be applied to image processing devices, such as a mobile phone, a navigation system, and a portable media player with an imaging function.
The embodiments of the present invention illustrate examples for embodying the present invention, and the matters described in the embodiments have a correspondence relationship with the specified matters of the invention in the claims. Similarly, the specified matters of the invention in the claims have a correspondence relationship with the matters in the embodiments having the same names. However, the present invention is not limited to these embodiments, and various modifications may be made without departing from the spirit and scope of the present invention.
Further, the processing procedures described in the embodiments of the present invention may be regarded as a method including a series of procedures, as a program for causing a computer to execute the series of procedures, or as a recording medium having the program stored thereon. The recording medium may be, for example, a compact disc (CD), a MiniDisc (MD), a digital versatile disc (DVD), a memory card, or a Blu-ray Disc (registered trademark).
Assigned to Sony Corporation (assignment on the face of the patent), executed Mar 10, 2011.