An electronic device including an optical module, a method of operating the electronic device including the optical module, and a non-transitory computer-readable recording medium having recorded thereon a program for performing the method. The electronic device includes an optical module configured to project content on a projection surface and a processor configured to determine whether the electronic device is positioned within a predetermined range of the projection surface and control the optical module to project the content onto the projection surface based on the determination.
17. A method of operating an electronic device having a projector to project content onto a projection surface with which the electronic device is in contact, the method comprising:
determining whether the electronic device is in contact with the projection surface on which projected content is displayed by determining a distance between the electronic device and the projection surface,
in response to the determination that the distance between the electronic device and the projection surface is greater than zero such that the electronic device is not in contact with the projection surface, controlling the projector not to project content, and
in response to the determination that the distance between the electronic device and the projection surface is zero such that the electronic device is in contact with the projection surface, controlling the projector to project an indicator to inform a user of a position of a projection image to be output on the projection surface.
1. An electronic device comprising:
a memory configured to store instructions;
a processor configured to execute the stored instructions; and
a projector configured to project content onto a projection surface,
wherein the processor is further configured to:
determine whether the electronic device is in contact with the projection surface on which projected content is displayed by determining a distance between the electronic device and the projection surface,
in response to the determination that the distance between the electronic device and the projection surface is greater than zero such that the electronic device is not in contact with the projection surface, control the projector not to project content, and
in response to the determination that the distance between the electronic device and the projection surface is zero such that the electronic device is in contact with the projection surface, control the projector to project an indicator to inform a user of a position of a projection image to be output on the projection surface.
2. The electronic device of
determine whether the electronic device is in contact with a wall,
in response to determining that the electronic device is in contact with the wall, determine whether a projection image to be output from the projector is parallel to a ground surface, and
in response to the determination that the projection image from the projector is not parallel to the ground surface, output an indicator for providing a guide to position the electronic device such that the projection image from the projector is parallel to the ground surface.
3. The electronic device of
a sound that instructs to position the electronic device in a designated position;
a projection image comprising a message that instructs to position the electronic device in the designated position;
a color or a blinking number of an LED lamp;
a laser light guide; and
a vibration of a vibration device included in the electronic device.
4. The electronic device of
5. The electronic device of
6. The electronic device of
wherein the processor is further configured to control the hinge structure to position the electronic device on the projection surface based on rotation information of the electronic device.
7. The electronic device of
8. The electronic device of
9. The electronic device of
10. The electronic device of
11. The electronic device of
12. The electronic device of
13. The electronic device of
14. The electronic device of
15. The electronic device of
16. The electronic device of
18. A non-transitory computer-readable recording medium having recorded thereon a program for performing the method of
This application claims priority from Korean Patent Application No. 10-2015-0056896, filed on Apr. 22, 2015, in the Korean Intellectual Property Office, the disclosure of which is incorporated herein by reference in its entirety.
1. Field
Methods and apparatuses consistent with exemplary embodiments relate to an electronic device and a method, and more particularly, to an electronic device including an optical module and a method of operating the electronic device.
2. Description of the Related Art
A projector or a projection system is a display device that projects input image signals onto a screen using light emitted from a light source, such as an LED or a lamp, to display a picture image. Such a display device may be used for conference room presentations, motion-picture projection, home theaters, etc.
According to an aspect of an exemplary embodiment, there is provided an electronic device including: an optical module configured to project content on a projection surface; and a processor configured to determine whether the electronic device is positioned within a predetermined range of the projection surface and control the optical module to project the content onto the projection surface based on the determination.
The processor may be further configured to output a guide for indicating a position of a projection image to be output from the optical module.
The processor may be further configured to, in response to the electronic device not being positioned on the projection surface, output an indicator for providing a guide to position the electronic device on the projection surface.
The indicator may include at least one among: a sound that instructs to position the electronic device in a designated position; a projection image including a message that instructs to position the electronic device in the designated position; a color or a blinking number of an LED lamp; a laser light guide; and a vibration of a vibration device included in the electronic device.
The processor may be further configured to detect another electronic device near the electronic device and, in response to determining that the electronic device is not positioned on a level with the other electronic device, output an indicator for instructing to position the electronic device on a level with the other electronic device.
The indicator may include at least one of sound, light, vibration, and an image.
The electronic device may include a hinge structure configured to rotate the projection module, wherein the processor may be further configured to control the hinge structure to position the electronic device on the projection surface based on rotation information of the electronic device.
The processor may be further configured to adjust at least one of geometric information and color information of an image based on sensed direction information of the electronic device.
The processor may be further configured to, in response to an angle between the projection surface and the ground surface being greater than or equal to a predetermined value, adjust the projection image.
The processor may be further configured to, in response to an angle between the projection surface and a ground surface being less than a predetermined value, adjust a projection image.
The processor may be further configured to adjust the projection image according to whether the projection surface of the electronic device is orthogonal to a ground surface or parallel to the ground surface.
The processor may be further configured to, in response to a space onto which an image is to be projected from the electronic device being smaller than the projection surface, adjust the size of the projection image.
The processor may be further configured to determine an expandability of the projection image according to whether there is at least one other electronic device near the electronic device.
The processor may be further configured to determine whether there is at least one other electronic device near the electronic device according to at least one of a signal intensity of wireless communication between the electronic device and another electronic device and a relative distance between the electronic device and another electronic device measured with a sensor.
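The nearby-device determination described above combines two cues, wireless signal intensity and a sensed relative distance. A minimal sketch of one way such a check could be expressed follows; the function name and the threshold values are illustrative assumptions, not part of the disclosure.

```python
def is_device_nearby(rssi_dbm=None, sensed_distance_m=None,
                     rssi_threshold=-60, distance_threshold=0.3):
    """Decide whether another device is nearby using either cue.

    `rssi_dbm` stands in for the signal intensity of wireless
    communication; `sensed_distance_m` stands in for a relative
    distance measured with a sensor. Thresholds are assumptions.
    """
    by_signal = rssi_dbm is not None and rssi_dbm >= rssi_threshold
    by_distance = (sensed_distance_m is not None
                   and sensed_distance_m <= distance_threshold)
    # The text allows "at least one of" the two cues, so either suffices.
    return by_signal or by_distance
```

Either cue alone is sufficient under this reading, matching the "at least one of" language in the text.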
The processor may be further configured to, in response to determining the expandability of the projection image, divide the projection image into at least one other image and transmit the at least one other image to the at least one other electronic device.
The processor may be further configured to sense a user input and output a projection image in which an object is moved based on the user input.
The processor may be further configured to sense a user input and transmit information to move an object in the projection image to a projection image corresponding to another electronic device based on the user input.
According to an aspect of another exemplary embodiment, there is provided a method of operating an electronic device including an optical module, the method including: determining whether the electronic device is positioned within a predetermined range of a projection surface; and projecting content onto the projection surface using the optical module based on the determination.
The method may include outputting a guide for indicating a position of a projection image to be output from the optical module.
According to an aspect of another exemplary embodiment, there is provided a non-transitory computer-readable recording medium having recorded thereon a program for performing a method of operating an electronic device including an optical module, the method including: determining whether the electronic device is positioned within a predetermined range of a projection surface; and projecting content onto the projection surface using the optical module based on the determination.
According to an aspect of another exemplary embodiment, there is provided a method of projecting an image on a projection surface, the method including: determining whether an electronic device is positioned on the projection surface; in response to the electronic device not being positioned on the projection surface, instructing a user to position the electronic device on the projection surface, and in response to the electronic device being positioned on the projection surface, projecting the image on the surface by using an optical module.
The determining whether the electronic device is positioned on the projection surface may include determining whether the electronic device is positioned on a surface that is parallel to a ground surface or orthogonal to a ground surface.
The instructing a user to position the electronic device may include projecting light onto the projection surface indicating at least one among an outline of the image and corners of the image.
The method may further include detecting another electronic device near the electronic device and, in response to determining that the electronic device is not positioned on a level with the other electronic device, outputting an indicator for instructing to position the electronic device on a level with the other electronic device.
The method may further include sensing a user input and outputting an image in which an object is moved based on the user input.
These and/or other aspects will become apparent and more readily appreciated from the following description of the exemplary embodiments, taken in conjunction with the accompanying drawings in which:
Exemplary embodiments will be described below in detail with reference to the accompanying drawings. Further, a method of configuring and using an electronic device according to an exemplary embodiment will be described in detail with reference to the accompanying drawings. Like reference numerals refer to like elements throughout. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items. Expressions such as “at least one of,” when preceding a list of elements, modify the entire list of elements and do not modify the individual elements of the list.
Although the terms “first,” “second,” etc., may be used herein to describe various elements, these elements should not be limited by these terms. The above terms are used to distinguish one component from another. For example, a first element may be called a second element, and a second element may be called a first element.
The terminology used herein is for describing one or more exemplary embodiments, and is not intended to limit the scope of the present disclosure. An expression used in the singular encompasses the expression in the plural, unless it has a clearly different meaning in the context. It will be further understood that the terms “comprises,” “comprising,” “includes,” and/or “including” when used herein, specify the presence of stated features, integers, steps, operations, elements, components, and/or combinations thereof, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or combinations thereof.
An electronic device 100 according to an exemplary embodiment may be used while positioned on a projection surface using a projector lens mirror system.
The electronic device 100 according to an exemplary embodiment may be used while positioned on a table as shown in
Referring to
Referring to
The electronic device according to an exemplary embodiment may determine whether the electronic device is positioned on the projection surface or whether the electronic device is positioned within a predetermined distance from the projection surface and may project content according to a result of the determination.
The electronic device according to an exemplary embodiment may analyze the position of the electronic device and thus may output an indicator for guiding a position of a projection image.
The electronic device according to an exemplary embodiment may determine whether the electronic device is positioned on the projection surface and may provide a guide to a user to position the electronic device on the projection surface according to a result of the determination.
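The determinations above amount to a simple decision rule: project only when the device is on the surface (or within the predetermined distance), and otherwise guide the user instead. A minimal sketch of that rule, assuming hypothetical `Projector` and `distance_to_surface` interfaces not named in the disclosure:

```python
class Projector:
    """Minimal stand-in for the optical module's projector interface."""
    def __init__(self):
        self.state = "off"

    def project_indicator(self):
        # Show where the projection image will appear on the surface.
        self.state = "indicator"

    def stop(self):
        self.state = "off"


def control_projection(projector, distance_to_surface):
    """Project an indicator only when the device touches the surface.

    A measured distance of zero is treated as contact; any positive
    distance suppresses projection, mirroring the decision in the text.
    """
    if distance_to_surface == 0:
        projector.project_indicator()
    else:
        projector.stop()
    return projector.state
```

The same rule generalizes to the "within a predetermined distance" variant by replacing the equality test with a threshold comparison.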
According to an exemplary embodiment, the projection image may be expanded by positioning a plurality of electronic devices including respective projection modules adjacent to one another and projecting content through the plurality of electronic devices together.
Referring to
The electronic device 100 according to an exemplary embodiment includes a projection module 162 (e.g., projector), a processor 110, and a communication module 120 (e.g., communicator).
The electronic device 200 according to an exemplary embodiment includes a projection module 262, a processor 210, and a communication module 220.
The projection module 162 projects content onto a projection surface by the control of the processor 110. According to an exemplary embodiment, the projection module may be referred to as an optical module.
The communication module 120 communicates, through the network 300, with the external device 400 connected with the electronic device 200 and the electronic device 100, which include projection modules, and is configured to transmit content to the electronic device 100 by the control of the processor 110.
The processor 110 according to an exemplary embodiment may determine whether the electronic device 100 is positioned on the projection surface or whether the electronic device is positioned within a predetermined distance from the projection surface and may determine whether to project the content on the basis of the determination.
The processor 110 according to an exemplary embodiment may determine whether the electronic device 100 is positioned on the projection surface and may output an indicator for instructing to position the electronic device 100 on the projection surface.
The processor 110 may recognize the presence of the electronic device 200 positioned adjacent to the electronic device 100 and may determine the expandability of a projection image.
The electronic device 100 and the electronic device 200 may communicate with the external device 400 through the network 300. The electronic device 100 may project content received from the external device 400 through the network using the projection module 162. The electronic device 200 may project content received from the external device 400 through the network using the projection module 262.
The electronic device 100 may recognize the electronic device 200 positioned in the vicinity of the electronic device 100 (e.g., within a predetermined distance). When the electronic device 100 recognizes the electronic device 200, the electronic device 100 and the electronic device 200 may communicate through the network 300. When the electronic device 100 recognizes the electronic device 200, the electronic device 100 may transmit at least some of the content received from the external device 400 to the electronic device 200 to expand the projection image.
In response to a user input for an object displayed in the projection image, the electronic device 100 may transmit information regarding the object to the electronic device 200 through the network 300.
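The expansion behavior described above (transmitting part of the content to a recognized neighboring device so that the combined output forms one wider image) could follow a splitting rule like the sketch below. The even column split is an illustrative assumption; the disclosure states only that at least some of the content is transmitted to the other device.

```python
def split_for_expansion(frame_columns, num_devices):
    """Divide a projection frame (a list of pixel columns) evenly
    across adjacent devices so the combined output forms one wider
    projection image. Leftover columns go to the last device."""
    per_device = len(frame_columns) // num_devices
    parts = [frame_columns[i * per_device:(i + 1) * per_device]
             for i in range(num_devices)]
    # Append any remainder to the last device's share.
    parts[-1].extend(frame_columns[num_devices * per_device:])
    return parts
```

Each part would then be transmitted through the network to the corresponding device for projection.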
Referring to
The processor 110 may run an operating system or an application program to control a plurality of hardware and/or software elements connected to the processor 110 and may perform processing and operations of various types of data including multimedia data. The processor 110, for example, may be implemented as a system-on-chip (SoC). According to an exemplary embodiment, the processor 110 may further include a graphic processing unit (GPU).
The communication module 120 may perform data transmission/reception in communication with external devices, for example, other electronic devices or servers connected to the electronic device 100 through a network. According to an exemplary embodiment, the communication module 120 may include a Wi-Fi module 121, a Bluetooth (BT) module 122, and a radio frequency (RF) module 123.
For example, each of the Wi-Fi module 121 and the BT module 122 may include a processor that processes data transmitted or received through a corresponding module. The Wi-Fi module 121 and the BT module 122 are shown as separate blocks in
The RF module 123 may perform data transmission/reception, for example, RF signal transmission/reception. The RF module 123 may include, for example, a transceiver, a power amp module (PAM), a frequency filter, a low noise amplifier (LNA), and the like. The RF module 123 may further include a component for transmitting/receiving an electromagnetic wave over the air in wireless communication, such as a conductor or a conducting wire. Although
The memory 130 may include an internal memory 131. The internal memory 131 may include, for example, at least one of a volatile memory (e.g., a dynamic random access memory (DRAM), a static RAM (SRAM), or a synchronous dynamic RAM (SDRAM)) and a non-volatile memory (e.g., a one-time programmable read only memory (OTPROM), a programmable ROM (PROM), an erasable and programmable ROM (EPROM), an electrically erasable and programmable ROM (EEPROM), a mask ROM, a flash ROM, a NAND flash memory, or a NOR flash memory).
The memory 130 may store various types of data, programs, and/or applications that drive and control the electronic device 100 by the control of the processor. The memory 130 may store signals and/or data that is input or output corresponding to the driving of the one or more processors 110, the communication module 120, the sensor module 140, the input device 150, the optical module 160, the interface 170, the audio module 180, the camera module 191, the indicator 192, the motor 193, the power management module 194, the battery 195, and the wireless charging module 196.
The sensor module 140 may measure a physical quantity or sense an operation state of the electronic device 100 and may convert the measured or sensed information into an electrical signal. The sensor module 140 may include, for example, at least one of a gesture sensor 140A, a gyro sensor 140B, an acceleration sensor 140C, an ultrasonic sensor 140D, an infrared sensor 140E, a Hall sensor 140F, a proximity sensor 140G, and an illumination sensor 140H. Additionally or alternatively, the sensor module 140 may include, for example, an E-nose sensor, an electromyography (EMG) sensor, an electroencephalogram (EEG) sensor, an electrocardiogram (ECG) sensor, an iris sensor, a fingerprint sensor, a pressure sensor, etc. The sensor module 140 may further include a control circuit that controls one or more sensors included therein.
According to an exemplary embodiment, the sensor module 140 may use at least one sensor included in the sensor module 140 to detect whether the electronic device is positioned on a projection surface, whether the electronic device is positioned within a predetermined distance from the projection surface, whether the electronic device is positioned horizontally to the ground surface, or whether the electronic device is adjacent to another device.
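The placement conditions the sensor module detects can be folded into one classification step. A sketch under stated assumptions: `distance_cm` would come from the ultrasonic, infrared, or proximity sensor, `tilt_deg` from the gyro or acceleration sensor, and the threshold values are hypothetical.

```python
def classify_placement(distance_cm, tilt_deg,
                       contact_threshold_cm=0.1, level_tolerance_deg=2.0):
    """Classify device placement from sensed values.

    Returns "ready" when the device rests on the surface level with
    the ground, "tilted" when it touches but is not level, and
    "off_surface" when it is too far away to project.
    """
    on_surface = distance_cm <= contact_threshold_cm
    level = abs(tilt_deg) <= level_tolerance_deg
    if on_surface and level:
        return "ready"
    if on_surface:
        return "tilted"
    return "off_surface"
```

The "tilted" and "off_surface" outcomes correspond to the cases in which an indicator would be output to guide repositioning.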
The input device 150 may include a key 151. The key 151 may include, for example, a physical button, an optical key, and/or a keypad. According to an exemplary embodiment, the electronic device 100 may use the communication module 120 to receive a user input from an external device (e.g., a computer or a server) connected with the communication module 120.
The optical module 160 may include an illumination module 161 and a projection module 162 (e.g., projector). The projection module 162 may project light onto a screen and display an image. For example, the screen may be positioned inside or outside the electronic device 100.
According to an exemplary embodiment, when the electronic device 100 is positioned on the projection surface or within a predetermined distance from the projection surface, the optical module 160 may project content.
According to an exemplary embodiment, when the electronic device 100 is not positioned on the projection surface or within a predetermined distance from the projection surface, the power management module 194 may block power to the optical module 160 and thus prevent the optical module 160 from operating.
Schemes in which the projection module 162 projects light include a digital light processing (DLP) scheme, a liquid crystal on silicon (LCOS) scheme, a 3LCD scheme, an LCD scheme, a laser scheme, and the like.
The DLP scheme refers to a projection display scheme using a digital micro-mirror device (DMD), which is one type of screen display element. The LCOS scheme performs projection using an LCOS panel that displays an image by defining pixels with a plurality of scan lines and data lines, includes a crystal having a predetermined molecular arrangement, and transmits and reflects externally input light through the crystal. The 3LCD scheme uses a liquid crystal display through which lamp light is transmitted after being divided into three parts; light originating from the lamp is divided into red, green, and blue components, passed through LCD panels, and enlarged by a lens. A projector may also be implemented with a single LCD panel, as in an LCD scheme. The laser scheme may include a light source composed of a red light emitting device, a green light emitting device, and a blue light emitting device, an optical tunnel into which laser light emitted from the light source is incident, and a display device configured to project an image onto a screen using the laser light incident through the optical tunnel. The laser scheme may further include, as a projection module, a synthesis module that performs synthesis by transmitting or reflecting some colors of the laser light emitted from the light source, and a speckle removal unit that removes speckle by irregularly changing a phase of the synthesized laser light.
The interface 170 may include, for example, a high-definition multimedia interface (HDMI) 171 and a Universal Serial Bus (USB) 172. The interface 170 may be included in, for example, the communication module 120 shown in
The audio module 180 may bi-directionally convert a sound and an electrical signal. The audio module 180 may process sound information input or output through, for example, a speaker 181 and/or a microphone 182.
The electronic device 100 may also transmit audio to an external BT device using the BT module 122 included in the communication module 120, instead of including the audio module 180.
The camera module 191 may be a device for capturing a still image and a moving image, and according to an exemplary embodiment, may include one or more image sensors (e.g., a front sensor and/or a rear sensor), a lens, an image signal processor (ISP), or a flash (for example, a light emitting diode (LED) or a xenon lamp).
The camera module 191 receives an image (e.g., consecutive frames) corresponding to a user's motion, including a gesture, within a camera recognition range. For example, the recognition range of the camera module 191 may be a distance of about 0.1 meters to about 5 meters from the camera module to the user. The user's motion may include, for example, a body part of the user, such as the face, a facial expression, a hand, a fist, or a finger, or a motion of that body part.
The indicator 192 may indicate a state of the electronic device 100 or a component (e.g., the processor 110) thereof, for example, a booting state, a message state, and/or a charging state.
The motor 193 may convert an electrical signal into mechanical vibration. The electronic device 100 may include a processing device (e.g., a graphic processing unit (GPU)) for supporting a mobile TV. The processing device for supporting the mobile TV may process media data according to a standard such as digital multimedia broadcasting (DMB), digital video broadcasting (DVB), and/or MediaFLO.
The power management module 194 may manage power of the electronic device 100. The power management module 194 may include, for example, a power management integrated circuit (PMIC), a charger integrated circuit (IC), and/or a battery or fuel gauge.
According to an exemplary embodiment, when the electronic device 100 is positioned on a projection surface or within a predetermined distance from the projection surface, the power management module 194 may apply power to the optical module 160 to operate the optical module 160.
According to an exemplary embodiment, when the electronic device 100 is not positioned on a projection surface or within a predetermined distance from the projection surface, the power management module 194 may shut off power to the optical module 160 to prevent the optical module 160 from projecting content.
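Taken together, the two preceding paragraphs describe a power-gating rule for the optical module. A minimal sketch, assuming a hypothetical `PowerManager` wrapper and a distance-based proximity check (the class name and threshold are not from the disclosure):

```python
class PowerManager:
    """Sketch of the power management module's gating behavior:
    the optical module receives power only while the device is on
    the projection surface or within a predetermined distance of it."""

    def __init__(self, max_distance=1.0):
        self.max_distance = max_distance
        self.optical_powered = False

    def update(self, distance_to_surface):
        # Apply power within range; shut it off otherwise so the
        # optical module cannot project content.
        self.optical_powered = distance_to_surface <= self.max_distance
        return self.optical_powered
```

Gating power, rather than merely suppressing the video signal, matches the text's statement that the optical module is prevented from operating at all.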
The PMIC may be installed in, for example, an integrated circuit or an SoC semiconductor chip. The charging scheme may be classified into a wired connection and a wireless connection. The charger IC may charge a battery and prevent overvoltage or overcurrent originating from a charger. According to an exemplary embodiment, the charger IC may include a charger IC capable of performing at least one of a wired charging scheme and a wireless charging scheme.
The wireless charging module 196 may include a circuit capable of performing wireless charging, for example, a coil loop, a resonance circuit, or a rectifier. The wireless charging scheme includes, for example, a magnetic resonance scheme, a magnetic induction scheme, and an electromagnetic wave scheme.
The battery gauge may measure, for example, a residual quantity of the battery 195 or a voltage, current, or temperature during the charging of the battery 195. The battery 195 may store and generate electricity and may supply power to the electronic device 100 using the stored or generated electricity. The battery 195 may include, for example, a rechargeable battery or a solar battery.
Each of the above-described elements of the electronic device according to one or more exemplary embodiments may include one or more components, and the name of a corresponding element may vary depending on the type of the electronic device. The electronic device according to one or more exemplary embodiments may be configured to include at least one of the above elements, and some elements may be omitted or additional elements may be added. Furthermore, some of the elements of the electronic device may be combined into one entity that performs the same functions as those elements performed before being combined.
Referring to
The operating system 131 controls overall operations of the electronic device 100.
The signal processing module 132 performs buffering and/or signal decoding so that content received through the communication module 120 may be displayed through the optical module 160. The signal processing module 132 processes image data received by the electronic device 100 and may perform various image processing operations, such as decoding, scaling, noise filtering, frame rate conversion, and resolution conversion, on video data.
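As a toy illustration of one stage of such a pipeline, the sketch below stands in for the scaling operation only; a frame is modeled as a list of pixel rows, and real decoding, noise filtering, and frame rate conversion are omitted. The function name and frame representation are assumptions for illustration.

```python
def process_frame(frame, scale=2):
    """Nearest-neighbor upscaling of a frame by an integer factor.

    Each pixel is repeated `scale` times horizontally and each row
    `scale` times vertically, a simple stand-in for the resolution
    conversion step performed by a signal processing module.
    """
    scaled = []
    for row in frame:
        wide = [p for p in row for _ in range(scale)]  # widen columns
        scaled.extend([wide] * scale)                  # duplicate rows
    return scaled
```

A production module would chain this with the other operations the text lists (decoding, filtering, frame rate conversion) in sequence.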
The device position determination module 133 determines whether the electronic device is positioned on the projection surface.
According to an exemplary embodiment, the device position determination module 133 may determine whether the electronic device is positioned on the projection surface or within a predetermined distance from the projection surface and may determine the projection of content according to the determination.
According to an exemplary embodiment, the device position determination module 133 may determine whether the projection surface of the electronic device is orthogonal or parallel to the ground surface and may determine whether to output an indicator for instructing to position the electronic device on the projection surface according to the determination.
According to an exemplary embodiment, the device position determination module 133 may determine whether the electronic device is positioned on the projection surface, and may output an indicator for instructing to position the electronic device on the projection surface when it is determined that the electronic device is not positioned on the projection surface.
According to an exemplary embodiment, the device position determination module 133 may use a guidance sound and/or a mechanical sound that instructs to position the electronic device at a designated position as the indicator for instructing to position the electronic device on the projection surface by outputting the sound through the audio module 180.
According to an exemplary embodiment, the device position determination module 133 may use a projection image containing a guidance message that instructs to position the electronic device at a designated position as the indicator for instructing to position the electronic device on the projection surface. The projection image containing the guidance message may be output through the optical module 160.
According to an exemplary embodiment, the device position determination module 133 may use a color or a blinking number of an LED lamp or a laser light guide as an indicator for instructing to position the electronic device on the projection surface. The LED lamp or laser may be output through the indicator 192 of the electronic device 100.
According to an exemplary embodiment, the device position determination module 133 may use a vibration of a vibration device provided in the electronic device as an indicator for instructing to position the electronic device on the projection surface. The above-described vibration may be output through the motor 193.
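The four paragraphs above map each indicator type to the output path that emits it. That routing can be sketched as a simple dispatch table; the string keys and return values are illustrative labels for the modules named in the text, not identifiers from the disclosure.

```python
def emit_indicator(kind):
    """Return the module that would output the given indicator type,
    per the mapping described in the text: sounds go through the
    audio module 180, guidance images through the optical module 160,
    LED/laser cues through the indicator 192, and vibration through
    the motor 193."""
    routes = {
        "sound": "audio_module_180",
        "guidance_image": "optical_module_160",
        "led_or_laser": "indicator_192",
        "vibration": "motor_193",
    }
    return routes.get(kind, "unsupported")
```

A real implementation would invoke the corresponding driver rather than return a label, but the table captures the one-to-one routing the text describes.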
According to an exemplary embodiment, the device position determination module 133 may determine whether the electronic device is positioned on a level with another adjacent electronic device and may output an indicator for instructing to position the electronic device or the other electronic device on a level with the other when it is determined that the electronic device is not positioned on a level with the other electronic device.
According to an exemplary embodiment, the device position determination module 133 may use a guidance sound or a mechanical sound that instructs to position the electronic device at the designated position as an indicator for instructing to position the electronic device or the other electronic device on a level with the other. The sound may be output through the audio module 180.
According to an exemplary embodiment, the device position determination module 133 may use a projection image containing a guidance message that instructs to position the electronic device at the designated position as an indicator for instructing to position the electronic device or the other electronic device on a level with the other. The projection image containing the guidance message may be output through the optical module 160.
According to an exemplary embodiment, the device position determination module 133 may use a color or a blinking number of an LED lamp or a laser light guide as an indicator for instructing to position the electronic device or the other electronic device on a level with the other. A light source such as the LED lamp or the laser may be included in the indicator 192 of the electronic device 100.
According to an exemplary embodiment, the device position determination module 133 may perform control to rotate the electronic device such that the electronic device is positioned on the projection surface on the basis of rotation information of the electronic device. The rotation of the electronic device may use a hinge structure included in the electronic device.
The projection surface analysis module 134 corrects an image by analyzing a projector projection surface.
According to an exemplary embodiment, the projection surface analysis module 134 may sense direction information of the electronic device and may correct at least one of geometric information and color information of a projection image on the basis of the sensed direction information of the electronic device. The direction information of the electronic device may be sensed using at least one of the gyro sensor 140B and the acceleration sensor 140C included in the sensor module 140 of the electronic device.
According to an exemplary embodiment, the projection surface analysis module 134 may analyze the projection surface. The projection surface analysis module 134 may not correct a projection image when an angle between the projection surface and the ground surface is less than a predetermined value and may correct the projection image when the angle between the projection surface and the ground surface is equal to or greater than the predetermined value. The angle between the projection surface and the ground surface may be sensed using at least one of the gyro sensor 140B and the acceleration sensor 140C included in the sensor module 140.
According to an exemplary embodiment, the projection surface analysis module 134 may analyze the projection surface. The projection surface analysis module 134 may correct the projection image when the angle between the projection surface and the ground surface is less than the predetermined value and may not correct the projection image when the angle between the projection surface and the ground surface is equal to or greater than the predetermined value.
According to an exemplary embodiment, the projection surface analysis module 134 may determine whether to correct the projection image depending on whether the projection surface of the electronic device is horizontal (e.g., parallel) or vertical (e.g., orthogonal) to the ground surface. Whether the projection surface of the electronic device is horizontal or vertical to the ground surface may be sensed using at least one of the gyro sensor 140B and the acceleration sensor 140C included in the sensor module 140.
According to an exemplary embodiment, the projection surface analysis module 134 may not correct the projection image when the electronic device is positioned on a bottom surface, i.e., when the projection surface is horizontal to the ground surface, and may correct the projection image when the electronic device is positioned on a wall, that is, when the projection surface is vertical to the ground surface. When the electronic device is positioned on a bottom surface, there may be a reduced need to correct the projection image because the user may easily rotate the electronic device or adjust his or her position and thus have no difficulty in identifying a projection image that is output from the electronic device. However, if the electronic device is positioned on a wall, the need to correct the projection image may increase when the projection image is not output horizontally because the user may have difficulty in recognizing or identifying the projection image.
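The wall-versus-floor decision described above can be sketched in code; the function name, the 45-degree threshold, and the accelerometer axis convention below are illustrative assumptions, not part of the described embodiment:

```python
import math

def needs_keystone_correction(accel_xyz, threshold_deg=45.0):
    """Return True when the projection surface is closer to vertical
    (e.g., a wall) than to horizontal (e.g., a table), based on a raw
    accelerometer reading; gravity along +z means the device lies flat."""
    x, y, z = accel_xyz
    # Angle between the device's z axis and the gravity vector.
    tilt = math.degrees(math.atan2(math.hypot(x, y), abs(z)))
    return tilt >= threshold_deg

# Device lying flat on a table: gravity mostly along z -> no correction.
print(needs_keystone_correction((0.1, 0.0, 9.8)))  # False
# Device mounted on a wall: gravity mostly along x -> correct the image.
print(needs_keystone_correction((9.8, 0.0, 0.2)))  # True
```

In practice the threshold and axis mapping would depend on how the sensor module is mounted relative to the optical module.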
According to an exemplary embodiment, the projection surface analysis module 134 may correct the size of the projection image when the size of a space on which an image is to be projected is smaller than the size of the projection surface.
The projection image expandability determination module 135 determines whether to expand and project the projection image.
According to an exemplary embodiment, the projection image expandability determination module 135 may determine whether there is at least one other electronic device in the vicinity of the electronic device and may determine the expandability of the projection image depending on whether there is at least one other electronic device in the vicinity of the electronic device. According to an exemplary embodiment, whether there is at least one other electronic device in the vicinity of the electronic device may be sensed using at least one of a signal intensity of wireless communication between an electronic device and another adjacent electronic device through the communication module 120 included in the electronic device 100 and a relative distance between the electronic device and the other electronic device measured through the infrared sensor 140E or the ultrasonic sensor 140D included in the sensor module 140.
According to an exemplary embodiment, when the projection image expandability determination module 135 determines the expandability of the projection image because there is at least one other electronic device in the vicinity of the electronic device, the projection image expandability determination module 135 may divide the projection image into at least one separate image and may transmit the at least one separate image to the at least one other electronic device.
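The division of a projection image into separate images for adjacent electronic devices might be sketched as follows; the strip-wise, left-to-right split is an assumed layout, not one mandated by the embodiment:

```python
def split_projection_image(image, num_devices):
    """Split an image (a list of pixel rows) into vertical strips, one per
    device, left to right; strip widths differ by at most one pixel when
    the width is not evenly divisible."""
    width = len(image[0])
    base, extra = divmod(width, num_devices)
    strips, start = [], 0
    for i in range(num_devices):
        end = start + base + (1 if i < extra else 0)
        strips.append([row[start:end] for row in image])
        start = end
    return strips

# An 8-pixel-wide, 2-row "screen A" shared between two electronic devices.
screen_a = [list("ABCDEFGH"), list("abcdefgh")]
a1, a2 = split_projection_image(screen_a, 2)
print(a1[0], a2[0])  # ['A', 'B', 'C', 'D'] ['E', 'F', 'G', 'H']
```

Each strip would then be transmitted to the corresponding adjacent device for projection.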
The user input processing module 136 senses a user input in the output projection image and performs data processing in response to the sensing.
According to an exemplary embodiment, the user input processing module 136 may sense a user input that instructs to move an object in the projection image and may output the projection image in which the object is moved on the basis of the user input. The user input that instructs to move the object in the projection image may be sensed through the camera module 191 of the electronic device or through the ultrasonic sensor 140D or the infrared sensor 140E included in the sensor module 140.
According to an exemplary embodiment, the user input processing module 136 may sense a user input that instructs to move an object in the projection image onto a projection image corresponding to another electronic device and may transmit the object to the other electronic device on the basis of the user input.
According to an exemplary embodiment, the electronic device 100 may receive content to be projected through the communication module 120 from an external device. For example, the electronic device 100 may receive content in a wired or wireless manner from at least one of various external devices such as a flexible device 1, a watch-type device 2, a tablet PC 3, a mobile device 4, a display 5 such as a television, a notebook computer 6, a desktop computer 7, a glasses-shaped device 8, and a head-mounted device. The content received from one of these external devices may be mirroring content or may be content different from content displayed on the external device that transmits the content.
According to an exemplary embodiment, the electronic device 100 may receive content to be projected through the interface 170 of the electronic device 100.
According to an exemplary embodiment, the electronic device 100 may acquire the content to be projected from the memory 130 of the electronic device 100.
According to one or more exemplary embodiments, the electronic device 100 may be used alone to project the projection image. However, the projection image may be expanded to various sizes, such as about two times, three times, or four times the original size, by using two or more electronic devices together.
Referring to
For example, the electronic device 100-1 is connected with an external device 51, and the electronic device 100-2 is connected with an external device 52. The electronic device 100-1 may transmit at least some data of screen A received from the external device 51 to the electronic device 100-2. The projection image may be expanded by the electronic device 100-1 and the electronic device 100-2 that mutually project screen A received from the external device 51, that is, by the electronic device 100-1 that projects portion A-1 of screen A received from the external device 51 and the electronic device 100-2 that projects portion A-2 of screen A received from the electronic device 100-1.
Referring to
For example, the electronic device 100-1 is connected with an external device 61, the electronic device 100-2 is connected with an external device 62, the electronic device 100-3 is connected with an external device 63, and the electronic device 100-4 is connected with an external device 64. The electronic device 100-1 may transmit at least some data of screen A received from the external device 61 to the electronic device 100-2, the electronic device 100-3, and the electronic device 100-4. The projection image may be expanded by the electronic device 100-1, the electronic device 100-2, the electronic device 100-3, and the electronic device 100-4 that mutually project screen A received from the external device 61, that is, by the electronic device 100-1 that projects portion A-1 of screen A, the electronic device 100-2 that projects portion A-2 of screen A, the electronic device 100-3 that projects portion A-3 of screen A, and the electronic device 100-4 that projects portion A-4 of screen A.
According to an exemplary embodiment, one electronic device may be connected with a plurality of external devices. As shown in
According to an exemplary embodiment, a plurality of electronic devices may be connected with one external device. For example, the plurality of electronic devices 100-1, 100-2, 100-3, and 100-4 may also be connected with any one of the external device 61, the external device 62, the external device 63, and the external device 64.
Referring to
The arrangements shown in
According to one or more exemplary embodiments, electronic devices may implement the expanded projection image by mutually projecting one image and may also respectively project different images.
Referring to
Referring to
Referring to
According to an exemplary embodiment, a processor 110 of the electronic device 100 may use a device position determination module 133 to determine whether the electronic device 100 is positioned on a projection surface.
According to an exemplary embodiment, the processor 110 of the electronic device 100 may use the device position determination module 133 to determine whether the electronic device 100 is positioned within a predetermined distance from the projection surface.
When the device position determination module 133 determines whether the electronic device 100 is positioned on the projection surface or within the predetermined distance from the projection surface, the device position determination module 133 may analyze a sensor value measured by one or more sensors included in a sensor module 140. In this case, available sensors may include one or more of a proximity sensor 140G, an illumination sensor 140H, a Hall sensor (e.g., a Hall effect IC) 140F, an ultrasonic sensor 140D, an infrared sensor 140E, and a pressure sensor.
In operation 420, the electronic device 100 projects content on the basis of the determination of the position.
According to an exemplary embodiment, the processor 110 of the electronic device 100 controls an optical module 160 to project the content according to the determination of whether the electronic device 100 is positioned on the projection surface. That is, the electronic device 100 projects the content when it is determined that the electronic device 100 is positioned on the projection surface, and does not project the content when it is determined that the electronic device 100 is not positioned on the projection surface. For example, the processor 110 of the electronic device 100 may project the content through the optical module 160 when it is determined that the electronic device 100 is positioned on the projection surface. If it is determined that the electronic device 100 is not positioned on the projection surface, although the electronic device 100 is powered on, the processor 110 of the electronic device 100 may prevent the content from being projected through the optical module 160 by powering off the optical module 160 or the illumination module 161 of the optical module 160. In this way, the electronic device 100 may project the content only when the electronic device 100 is positioned on the projection surface and may not project the content by powering off the optical module 160 that projects the content when the electronic device 100 is not positioned on the projection surface, thus preventing unnecessary power consumption.
According to an exemplary embodiment, when the electronic device 100 is positioned on the projection surface, the projected content may be empty content, that is, content containing nothing.
According to an exemplary embodiment, the content projected when the electronic device 100 is positioned on the projection surface may be content containing a designated message. For example, the designated message may include “<the content is ready to be projected>.”
According to an exemplary embodiment, the processor 110 of the electronic device 100 controls the optical module 160 to project the content according to the determination of whether the electronic device 100 is positioned within a predetermined distance from the projection surface. That is, the electronic device 100 projects the content when it is determined that the electronic device 100 is positioned within the predetermined distance from the projection surface, and the electronic device 100 does not project the content when it is determined that the electronic device 100 is not positioned within the predetermined distance from the projection surface. For example, when it is determined that the electronic device 100 is positioned within the predetermined distance from the projection surface, the processor 110 of the electronic device 100 may project the content through the optical module 160. If it is determined that the electronic device 100 is not positioned within the predetermined distance from the projection surface, although the electronic device 100 is powered on, the processor 110 of the electronic device 100 may prevent the content from being projected through the optical module 160 by powering off the optical module 160 or the illumination module 161 of the optical module 160. In this way, the electronic device 100 may project the content when the electronic device 100 is positioned within the predetermined distance from the projection surface and may not project the content by powering off the optical module 160 when the electronic device 100 is not positioned within the predetermined distance from the projection surface, thus preventing unnecessary power consumption.
In addition, when the electronic device 100 is not positioned on the projection surface but is positioned within the predetermined distance from the projection surface, the electronic device 100 may project content to inform a user of an approximate position of the projection surface in advance when the user positions the electronic device 100, thus providing convenience of use of the electronic device 100.
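The contact/near/far projection control described in the preceding paragraphs can be summarized as a small decision function; the threshold values and action labels below are hypothetical illustrations:

```python
def control_projection(distance_m, contact_threshold_m=0.0, near_threshold_m=0.3):
    """Choose a projector action from the measured device-to-surface distance."""
    if distance_m <= contact_threshold_m:
        return "project_content"             # in contact with the surface
    if distance_m <= near_threshold_m:
        return "project_position_indicator"  # guide the user toward the surface
    return "power_off_optical_module"        # avoid unnecessary power consumption

print(control_projection(0.0))  # project_content
print(control_projection(0.1))  # project_position_indicator
print(control_projection(2.0))  # power_off_optical_module
```

The distance itself would come from a proximity, ultrasonic, infrared, or pressure sensor as described above.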
Referring to
Referring to
Referring to
Referring to
According to an exemplary embodiment, a processor 110 of the electronic device 100 may use a device position determination module 133 to analyze the position of the electronic device 100. The device position determination module 133 may analyze a sensor value measured by one or more sensors included in a sensor module 140 to analyze the position, that is, to determine whether the electronic device 100 is positioned within a predetermined distance from a projection surface. Available sensors may include one or more of a proximity sensor 140G, an illumination sensor 140H, a Hall sensor (e.g., a Hall effect IC) 140F, an ultrasonic sensor 140D, an infrared sensor 140E, and a pressure sensor. The predetermined distance from the projection surface may be determined in various ways.
In operation 620, the electronic device 100 outputs an indicator for guiding a position of a projection image.
According to an exemplary embodiment, the processor 110 of the electronic device 100 may use the device position determination module 133 to output the indicator for guiding the position of the projection image.
Referring to
Referring to
Referring to
Referring to
It should be understood by those skilled in the art that a means for informing of a position of the projection image of the electronic device 100 may be determined in various ways.
Referring to
According to an exemplary embodiment, a processor 110 of the electronic device 100 may use a device position determination module 133 to determine whether the electronic device 100 is positioned on a projection surface or within a predetermined distance from the projection surface. On the basis of the determination, the processor 110 of the electronic device 100 may determine whether to project content through an optical module.
According to an exemplary embodiment, the processor 110 of the electronic device 100 may use a device position determination module 133 to determine the position of the electronic device 100. When the position of the electronic device 100 is not suitable for the projection surface, the processor 110 of the electronic device 100 may output an indicator for instructing to position the electronic device 100 on the projection surface. In addition, when one or more other electronic devices adjacent to the electronic device 100 are recognized, the processor 110 may also output an indicator for instructing to position the electronic device 100 and the other adjacent electronic devices level with one another. The position determination operation of the electronic device will be described below in detail.
In operation 820, the electronic device 100 analyzes a projection surface.
The processor 110 of the electronic device 100 may use a projection surface analysis module 134 to analyze the projection surface of the electronic device 100 and may use direction information or rotation information of the electronic device 100 to correct geometric information or color information of the projection image or rotate the projection image. The projection surface analysis operation of the electronic device will be described below in detail with reference to
In operation 830, the electronic device 100 acquires content.
The processor 110 of the electronic device 100 may acquire content to be projected from at least one of a communication module 120, an interface 170, and a memory 130 and may perform buffering and/or signal decryption on the acquired content, using a signal processing module 132, so that the content can be displayed through a projection module.
In operation 840, the electronic device 100 projects the content.
The processor 110 of the electronic device 100 may project content onto the projection surface using a projection module 162.
Below, the position determination operation of the electronic device will be described in detail with reference to
Referring to
The processor 110 of the electronic device 100 may use the device position determination module 133 to determine whether the projection surface of the electronic device is vertical or horizontal to the ground surface. When the projection surface of the electronic device is horizontal to the ground surface, this indicates that the electronic device is positioned on a bottom surface. When the projection surface of the electronic device is vertical to the ground surface, this indicates that the electronic device 100 is installed on a wall. The device position determination module 133 may use at least one of a gyro sensor 140B and an acceleration sensor 140C included in a sensor module 140 to determine whether the projection surface of the electronic device 100 is vertical or horizontal to the ground surface.
In operation 812, the electronic device determines whether to perform the projection surface position determination process according to whether the projection surface of the electronic device is vertical or horizontal to the ground surface.
The processor 110 of the electronic device 100 may use the device position determination module 133 to determine whether to perform the projection surface position determination process according to whether the electronic device 100 is positioned on the bottom surface or installed on the wall. If the electronic device 100 is installed on the wall, when the horizontal side of the projection image is not positioned horizontally to the ground surface but is inclined from it (for example, as shown in
Referring to
The processor 110 of the electronic device 100 may use the device position determination module 133 to determine whether the electronic device 100 is suitably positioned on the projection surface. The device position determination module 133 may use at least one of a gyro sensor 140B and an acceleration sensor 140C included in a sensor module 140 to determine whether the electronic device 100 is suitably positioned on the projection surface, i.e., whether the electronic device 100 is positioned horizontally to the ground surface.
In operation 814, when it is determined that the electronic device 100 is not positioned on the projection surface, the electronic device 100 outputs an indicator for instructing to position the electronic device on the projection surface.
When the device position determination module 133 determines that the electronic device 100 is not positioned on the projection surface, that is, when the electronic device 100 is not positioned horizontally to the ground surface, the processor 110 of the electronic device 100 may output an indicator for instructing to position the electronic device 100 on the projection surface.
The electronic device 100 may use at least one of light, sound, a projection image, and vibration as the indicator for instructing to position the electronic device 100 on the projection surface.
Referring to
In
In the above example, the LED color is green when the electronic device 100 is suitably positioned on the projection surface, and the LED color is red when the electronic device 100 is not suitably positioned on the projection surface. However, this is an example, and it should be fully understood by those skilled in the art that other colors may be used as an LED color for representing that the electronic device is suitably positioned on the projection surface and an LED color for representing that the electronic device is not suitably positioned on the projection surface as long as they are distinct from each other.
Furthermore, according to an exemplary embodiment, the electronic device 100 may provide a guide to a user by setting a blinking rate of the LED lamp 192-1 differently in proportion to the degree of how far the electronic device 100 is from horizontality. For example, the electronic device 100 may indicate to the user how far the electronic device 100 is from horizontality by increasing the blinking rate of the LED lamp 192-1 as the electronic device 100 is farther from the horizontal position and by decreasing the blinking rate of the LED lamp 192-1 as the electronic device 100 is closer to horizontality, or vice versa.
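The proportional blinking rate described above might be computed as a simple linear interpolation; the interval bounds and the 45-degree cap are assumed values for illustration:

```python
def blink_interval_ms(tilt_deg, fast_ms=100, slow_ms=1000, max_tilt_deg=45.0):
    """Blink faster the farther the device is from horizontality:
    0 degrees -> slow blink; max_tilt_deg or more -> fast blink."""
    frac = min(abs(tilt_deg), max_tilt_deg) / max_tilt_deg
    return round(slow_ms - frac * (slow_ms - fast_ms))

print(blink_interval_ms(0))     # 1000 (level: slow blink)
print(blink_interval_ms(45))    # 100 (far from level: fast blink)
print(blink_interval_ms(22.5))  # 550
```

Swapping `fast_ms` and `slow_ms` gives the "or vice versa" variant, where the blink slows as the device moves farther from level.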
Referring to
In
In the above example, the laser color is green when the electronic device 100 is suitably positioned on the projection surface, and the laser color is red when the electronic device 100 is not suitably positioned on the projection surface. However, this is an example, and it should be fully understood by those skilled in the art that other colors may be used as a laser color for representing that the electronic device is suitably positioned on the projection surface and a laser color for representing that the electronic device is not suitably positioned on the projection surface as long as they are distinct from each other.
Furthermore, according to an exemplary embodiment, the electronic device 100 may provide a guide to a user by setting a blinking rate of the light source, such as the laser or the LED lamp, differently in proportion to how far the electronic device 100 is from horizontality. For example, the electronic device 100 may indicate to the user how far the electronic device 100 is from horizontality by increasing the blinking rate of the light source such as the laser or the LED lamp as the electronic device 100 is farther from horizontality and by decreasing the blinking rate of the laser or the LED lamp as the electronic device 100 is closer to horizontality, or vice versa.
Referring to
In
According to an exemplary embodiment, the electronic device 100 may provide a guide to the user by setting a repetition rate of the mechanical sound differently in proportion to how far the electronic device 100 is from horizontality. For example, the electronic device 100 may allow the user to be intuitively aware of how far the electronic device 100 is from horizontality by increasing the repetition rate of the mechanical sound as the electronic device 100 is farther from horizontality and by decreasing the repetition rate of the mechanical sound as the electronic device 100 is closer to horizontality.
Referring to
In
In the above example, it should be fully understood by those skilled in the art that a voice guidance message output from the audio module when the electronic device 100 is positioned suitably on the projection surface and a voice guidance message output from the audio module when the electronic device 100 is not positioned suitably may be implemented in various ways.
Referring to
In
In the above example, it should be fully understood by those skilled in the art that a guidance message of an image that is output by a projection module when the electronic device 100 is positioned suitably on the projection surface and a guidance message that is output when the electronic device 100 is not suitably positioned may be implemented in various ways.
Referring to
In
Referring to
In
According to an exemplary embodiment, when it is determined by the device position determination module 133 that the electronic device 100 is not positioned on the projection surface, the electronic device 100 may be positioned on the projection surface by using a hinge structure included in the electronic device 100 to automatically rotate the electronic device 100.
Referring to the left side of
When it is determined that the electronic device 100 is not positioned on the projection surface, the electronic device 100 may calculate a rotation angle to position the electronic device 100 horizontally, using a value measured by a gyro sensor 140B or an acceleration sensor 140C included in a sensor module 140.
Referring to the right side of
According to an exemplary embodiment, the electronic device 100 may be rotated on the basis of rotation information. The hinge structure may be built in the electronic device 100, and thus the electronic device 100 itself may be rotated. Also, the hinge structure may be built in the bottom of the optical module of the electronic device 100, and thus, according to an exemplary embodiment, only the optical module of the electronic device 100 may be rotated.
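The rotation angle mentioned above could, under the assumption of a wall-mounted device and an in-plane accelerometer reading, be derived from the gravity components as follows; the axis convention and function name are illustrative:

```python
import math

def leveling_rotation_deg(accel_x, accel_y):
    """Rotation (in degrees) about the projection axis needed to bring a
    wall-mounted device level, from the two in-plane accelerometer
    components; 0 means the device is already horizontal."""
    return math.degrees(math.atan2(accel_x, accel_y))

# Gravity entirely along the device's y axis: already level.
print(leveling_rotation_deg(0.0, 9.8))  # 0.0
# Equal in-plane components: rotate by about 45 degrees.
print(leveling_rotation_deg(9.8, 9.8))  # ~45.0
```

The computed angle would then drive the hinge structure that rotates the electronic device or its optical module.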
Referring to
The processor 110 of the electronic device 100 may use the device position determination module 133 to recognize the other electronic device adjacent to the electronic device 100. To sense whether there is an electronic device adjacent to the electronic device 100, signal intensity in wireless communication or a relative distance measured by an infrared sensor or an ultrasonic sensor may be used.
According to an exemplary embodiment, the device position determination module 133 may use a communication module 120 to measure the signal intensity in the wireless communication and may determine that a nearby electronic device is sufficiently adjacent when the signal intensity is equal to or greater than a certain value. Various communication techniques such as Wi-Fi, Bluetooth, or Zigbee may be used for the wireless communication.
According to an exemplary embodiment, the device position determination module 133 may use an infrared sensor 140E or the ultrasonic sensor 140D for measuring the relative distance between the electronic device 100 and the nearby electronic device to measure the relative distance, and may recognize that the nearby electronic device is positioned when the measured relative distance is equal to or less than a threshold value.
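The adjacency recognition via signal intensity or measured relative distance might be sketched as follows; the RSSI and distance thresholds are hypothetical values:

```python
def is_adjacent(rssi_dbm=None, measured_distance_m=None,
                rssi_threshold_dbm=-60, distance_threshold_m=0.5):
    """Recognize a nearby electronic device when the wireless signal
    strength is at or above a threshold, or when the relative distance
    measured by an infrared or ultrasonic sensor is at or below one."""
    if rssi_dbm is not None and rssi_dbm >= rssi_threshold_dbm:
        return True
    if measured_distance_m is not None and measured_distance_m <= distance_threshold_m:
        return True
    return False

print(is_adjacent(rssi_dbm=-45))             # True (strong Wi-Fi/Bluetooth signal)
print(is_adjacent(measured_distance_m=0.2))  # True (close per ultrasonic/IR ranging)
print(is_adjacent(rssi_dbm=-80))             # False
```

Either criterion alone suffices here; a real implementation could combine both measurements or calibrate the thresholds per radio technology.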
In operation 1120, the electronic device 100 determines whether the electronic device is positioned on a level with another adjacent electronic device.
The processor 110 of the electronic device 100 may use the device position determination module 133 to determine whether the electronic device is positioned on a level with the other adjacent electronic device. The device position determination module 133 may use at least one of the gyro sensor 140B and the acceleration sensor 140C included in the sensor module 140 to determine whether the electronic device 100 is positioned on a level with the other adjacent electronic device.
In operation 1130, when it is determined that the electronic device 100 is not positioned on a level with the other adjacent electronic device, the electronic device 100 outputs an indicator for instructing to position the electronic device 100 or the other adjacent electronic device on a level with the other.
When the device position determination module 133 determines that the electronic device 100 is not positioned on a level with the other adjacent electronic device, the processor 110 of the electronic device 100 may output an indicator for instructing to position the electronic device 100 or the other adjacent electronic device level with the other.
The electronic device 100 may use at least one of light, a projection image, and sound as the indicator for instructing to position the electronic device 100 or the other adjacent electronic device on a level with the other. Examples of the indicators that use the light, the sound, and the projection image will be described below.
Referring to
In
In the above example, the LED color is green when the electronic devices 100-1 and 100-2 are positioned on a level with each other, and the LED color is red when the electronic devices 100-1 and 100-2 are not positioned on a level with each other. However, this is an example, and it should be fully understood by those skilled in the art that other colors may be used as an LED color for representing that the electronic devices are positioned on a level with each other and an LED color for representing that the electronic devices are not positioned on a level with each other.
According to an exemplary embodiment, the electronic device 100 may guide a user by setting blinking rates of the LED lamps 192-1 and 192-2 in proportion to how far the electronic devices 100-1 and 100-2 are from being level. For example, the electronic devices 100-1 and 100-2 may allow the user to be intuitively aware of how far the electronic devices 100-1 and 100-2 are from being level by increasing the blinking rates of the LED lamps 192-1 and 192-2 as the electronic devices 100-1 and 100-2 move farther from being level and by decreasing the blinking rates as the electronic devices 100-1 and 100-2 move closer to being level, or vice versa.
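The blinking-rate guide above can be sketched as a simple linear mapping from the level error to a blink frequency. The function name, the rate range, and the 30-degree clamp are hypothetical choices for illustration only.

```python
def blink_rate_hz(level_error_deg, min_hz=0.5, max_hz=5.0, max_error_deg=30.0):
    # Map the misalignment angle to a blink rate: the farther the
    # devices are from being level, the faster the LED blinks.
    error = min(abs(level_error_deg), max_error_deg)  # clamp large errors
    return min_hz + (max_hz - min_hz) * (error / max_error_deg)
```

The inverse mapping (slower blinking for larger errors) would simply swap `min_hz` and `max_hz`.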
According to an exemplary embodiment, the form of the LED light may further be used to instruct a user to position the electronic device. That is, while the LED lamp shown in
The LED lamp 192-1 or 192-2 may use the color of the LED lamp or the form of light emitted from the LED lamp to guide a user to position the electronic device 100-1 or 100-2 on a level with the other. For example, the light emitted from the LED lamps 192-1 and 192-2 may be represented as a green dot when the electronic devices 100-1 and 100-2 are positioned on a level with each other and may be represented as a red dot or a red arrow when the electronic devices 100-1 and 100-2 are not positioned on a level with each other. In addition, the LED lamps 192-1 and 192-2 may change the length of the arrows in proportion to how far the electronic devices 100-1 and 100-2 are from being level.
In
Referring to
According to an exemplary embodiment, when a plurality of electronic devices are adjacent to one another, one of the plurality of electronic devices may be set as a reference electronic device, and the reference electronic device may provide a guide for positions where other electronic devices are to be positioned to be on a level with one another. For example, when the electronic devices are not on a level with one another, the reference electronic device out of the plurality of electronic devices may provide a guide for the positions where the electronic devices are to be positioned to be on a level with one another using red light of a light source such as the laser or the LED lamp. On the other hand, when the electronic devices are on a level with one another, the reference electronic device may display that the electronic devices are on a level with one another using green light of the light source such as the laser or the LED lamp.
In
In the lower left diagram of
In the above example, the laser color is green when the electronic devices 100-1 and 100-2 are positioned on a level with each other, and the laser color is red when the electronic devices 100-1 and 100-2 are not positioned on a level with each other. However, this is an example, and it should be fully understood by those skilled in the art that other colors may be used as a laser color for representing that the electronic devices are positioned on a level with each other and a laser color for representing that the electronic devices are not positioned on a level with each other.
Referring to
The reference electronic device among the plurality of electronic devices may use a laser to provide a guide about points where the other electronic devices are to be positioned so as to be on a level with one another. In addition, the reference electronic device may display a plurality of points where the other electronic devices are to be positioned and may project, through the optical module, a guidance image about an aspect ratio that may be configured for each point.
In
If the user positions the electronic device 100-2 in position P3 guided by the electronic device 100-1 as shown at the left side of
Referring to
Referring to upper left
According to an exemplary embodiment, when the device position determination module 133 determines that the electronic device 100 is not positioned horizontally, the electronic device 100 may be positioned on the projection surface by using a hinge structure included in the electronic device 100.
Referring to the left side of
Similarly, an electronic device 100-2 determines that the electronic device 100-2 is not positioned on a projection surface 1300-2 because the projection surface 1300-2 is not positioned horizontally to the ground surface. When it is determined that the electronic device 100-2 is not positioned on the projection surface, the electronic device 100-2 may calculate a rotation angle to position the electronic device 100-2 horizontally, using a value measured by a gyro sensor 140B or an acceleration sensor 140C included in a sensor module 140.
Referring to the right side of
In this way, when a plurality of electronic devices are adjacent to one another, and at least one of the plurality of electronic devices is not positioned horizontally, a plurality of projection surfaces may be arranged to expand a projection image by rotating the electronic device that is not positioned horizontally using the hinge structure of the electronic device.
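The rotation-angle calculation mentioned above can be sketched from a single accelerometer sample. The helper name and the two-axis model (rotation about the hinge axis only) are illustrative assumptions, not the device's actual firmware.

```python
import math

def rotation_to_horizontal_deg(ax, az):
    # Roll angle about the hinge axis, derived from the gravity vector's
    # components along the device's x and z axes; 0 when already level.
    # Rotating the hinge by the negative of this angle levels the device.
    return math.degrees(math.atan2(ax, az))
```

The hinge structure would then be driven by this angle to bring the projection surface horizontal, aligning it with the adjacent device's surface.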
Referring to the left side of
Similarly, an electronic device 100-2 determines that the electronic device 100-2 is not positioned on a projection surface 1400-2 because the projection surface 1400-2 is not positioned horizontally to the ground surface. When it is determined that the electronic device 100-2 is not positioned on the projection surface, the electronic device 100-2 may calculate a rotation angle to position the electronic device 100-2 horizontally, using a value measured by a gyro sensor 140B or an acceleration sensor 140C included in a sensor module 140.
Referring to the right side of
However, as shown at the right side of
In addition, when the electronic device 100-1 and the electronic device 100-2 recognize each other, i.e., when the electronic device 100-1 recognizes that there is the electronic device 100-2 in the vicinity of the electronic device 100-1, the electronic device 100-1 may expand the projection image using the projection surface of the electronic device 100-2. In this case, as shown at the right side of
According to an exemplary embodiment, when it is determined by a device position determination module that the device is not positioned suitably, the electronic device 100 may enter a sleep mode or a standby mode.
As an example in which the device is not positioned appropriately, there may be a case in which the user carries the electronic device 100 instead of laying the electronic device 100 on a floor or installing the electronic device 100 on a wall.
Another example in which the device is not positioned appropriately is a case in which, when the user installs the electronic device on a wall, the electronic device 100 is not positioned horizontally.
The sleep mode or the standby mode refers to a mode in which an illumination system of an optical module is turned off so as not to output the projection image. Thus, battery consumption caused by unnecessarily projecting an image may be reduced.
According to an exemplary embodiment, the electronic device 100 may analyze the projection surface and correct the image using the projection surface analysis module 134.
According to an exemplary embodiment, the electronic device 100 may sense direction information of the electronic device 100 to correct a distorted image. For example, the electronic device 100 may measure the degree to which the electronic device 100 is rotated using an acceleration sensor or a gyro sensor and may correct the projection image.
Because the direction of the optical module of the electronic device 100 is not exactly horizontal but is inclined, an image projected from the optical module is not displayed horizontally. In this case, the electronic device 100 may use rotation information of the electronic device 100 to correct and output only the projection image while leaving the projection surface as it is.
According to an exemplary embodiment, the electronic device 100 may use a camera included in the electronic device 100 to analyze an image projected from the electronic device and correct the image. In addition, according to an exemplary embodiment, a situation in which the electronic device is positioned on the projection surface may be captured using an electronic device including a camera different from that of the electronic device, and the degree to which the projection surface is corrected may be analyzed through the captured image. According to an exemplary embodiment, the electronic device may use a camera module included therein to correct color information and geometric information of the projected surface.
The geometric correction denotes that an image is corrected in consideration of concave and convex portions when the projection surface geometrically has concave and convex portions.
According to an exemplary embodiment, when the electronic device 100 is attached or hung on a wall and configured to project an image, one or more threshold values may be set for an angle between the ground surface and the projection surface, and the image may be corrected differently depending on each threshold value.
An example in which the correction is not performed when an angle between the ground surface and the projection surface of the electronic device is equal to or less than a first threshold and the correction is performed when the angle is greater than a second threshold will be described below with reference to
Referring to
The projection surface analysis module 134 may measure an angle between the projection surface and the ground surface by using at least one of a gyro sensor and an acceleration sensor included in a sensor module.
In operation 1620, the processor 110 of the electronic device 100 does not correct the projection image when the angle between the projection surface and the ground surface that is measured by the projection surface analysis module 134 is less than the predetermined value.
The predetermined value may be determined in various ways. Referring to
Referring to
In operation 1630, the projection surface analysis module 134 corrects the projection image when the angle between the projection surface and the ground surface is greater than or equal to a predetermined value.
Referring to
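The threshold decision in operations 1620 and 1630 reduces to a simple comparison between the measured angle and a predetermined value. The function name and the 5-degree default below are hypothetical placeholders for illustration.

```python
def should_correct(angle_deg, threshold_deg=5.0):
    # Correct the projection image only when the angle between the
    # projection surface and the ground surface reaches the threshold.
    return angle_deg >= threshold_deg
```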
An example in which the correction is performed when an angle between the ground surface and the projection surface of the electronic device is equal to or less than a first threshold and the correction is not performed when the angle is greater than a second threshold will be described below with reference to
Referring to
The projection surface analysis module 134 may measure an angle between the projection surface and the ground surface by using at least one of a gyro sensor and an acceleration sensor included in a sensor module.
In operation 1820, the processor 110 of the electronic device 100 corrects the projection image when the angle between the projection surface and the ground surface that is measured by the projection surface analysis module 134 is less than a predetermined value.
Referring to the left side of
In operation 1830, the projection surface analysis module 134 projects an image without correcting the projection image when the angle between the projection surface and the ground surface is greater than or equal to a predetermined value.
Referring to the left side of
According to an exemplary embodiment, when an angle between the ground surface and a projection surface of an electronic device 100 installed on a wall is greater than θ2 and less than θ3, the image may be changed to a landscape mode and then shown or may be corrected to be projected horizontally to the ground surface.
Referring to the left side of
According to an exemplary embodiment, when the projection surface of the electronic device is positioned horizontally to the ground surface, i.e., when the electronic device is placed on a floor or a table, the electronic device 100 may not correct the screen according to a geomagnetic sensor included in the electronic device. That is, when a change in a value with respect to a z axis is equal to or less than a threshold value and the position of the electronic device is changed only in an x-axis direction and a y-axis direction, the electronic device 100 may not correct the screen according to the geomagnetic sensor included in the electronic device. In other words, by determining whether the electronic device is attached or hung on a wall to perform projection or is placed on a floor to perform projection, a correction scheme based on values from the acceleration sensor included in the electronic device 100 may be set differently.
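The floor-versus-wall distinction above can be sketched by watching the spread of z-axis sensor samples: movement confined to the x and y axes suggests the device lies on a floor or table. The function name and threshold are illustrative assumptions.

```python
def is_on_floor(z_samples, z_change_threshold=0.3):
    # If the z-axis value barely changes while the device is moved,
    # the device is taken to be lying flat, and heading-based (geomagnetic)
    # screen correction can be skipped.
    return (max(z_samples) - min(z_samples)) <= z_change_threshold
```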
Referring to
Thus, the user may allow the electronic device 100 to project an image such that the projection surface faces a direction in which the user is positioned by easily moving the electronic device 100 or by changing the direction of the electronic device 100 and then positioning the electronic device 100, regardless of a direction of a table at which the user is positioned.
According to an exemplary embodiment, when the size of the floor on which the electronic device 100 performs projection is smaller than the size of the projection surface onto which the electronic device 100 can optically project, the electronic device 100 may analyze, using image recognition technology, the size of the surface onto which the image is to be projected and may use software to project an image smaller than the optically projectable surface.
Referring to
In this case, the electronic device 100 may analyze the size of the floor to which an image is to be projected, i.e., the size of a space to which an image is to be projected, and may adjust the size of the image to be projected in accordance with the size of the floor to which the image is to be projected. Referring to
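Fitting the projected image to the analyzed floor size, as described above, can be sketched as an aspect-preserving scale-down. The function name and the row/column pixel model are illustrative assumptions.

```python
def fit_image(image_w, image_h, floor_w, floor_h):
    # Shrink the image so it fits inside the analyzed floor area while
    # keeping its aspect ratio; never enlarge beyond the original size.
    scale = min(floor_w / image_w, floor_h / image_h, 1.0)
    return round(image_w * scale), round(image_h * scale)
```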
According to an exemplary embodiment, the electronic device 100 projects content alone when there are no nearby electronic devices, but the electronic device 100 may project content along with another nearby electronic device when the electronic device 100 recognizes the other nearby electronic device.
According to an exemplary embodiment, when images are combined using a plurality of electronic devices, one image may be synthesized from the images of the plurality of electronic devices in consideration of information about directions of two or more electronic devices and information about a distance between the electronic devices. A method of determining whether there are a plurality of nearby electronic devices is described above.
A device having a camera, for example, a smartphone 4, may capture two or more projection images, analyze the captured images to check an overlap region and a separation region of the projection images, and transmit corresponding information to an electronic device 100-1. The electronic device 100-1 may receive the information about the overlap region or the separation region and may perform a synthesis process on the overlap region or the separation region to form one natural screen.
Referring to
According to an exemplary embodiment, an image may be acquired using a device having a camera and corrected by passively capturing edges of the projected surface using an additional input or an additional mechanism of a user, for example, a stylus.
Referring to
A user may view screen A-1 and screen A-2 captured by the smartphone 4 and check an overlap region of images by touching vertices 41, 43, 45, and 47 of the four corners of screen A-1 and also touching vertices 42, 44, 46, and 48 of the four corners of screen A-2.
The smartphone 4 may check the overlap region having vertices 42, 43, 46, and 47 according to the stylus touch input and send information about the overlap region to the electronic device 100-1 and the electronic device 100-2. The electronic device 100-1 and the electronic device 100-2 may naturally connect the images by adjusting alpha values in screen A-1 and screen A-2 on the basis of a corresponding overlap region.
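Adjusting alpha values across the overlap region, as described above, is commonly done with complementary linear ramps so the two projections cross-fade into one another. This sketch assumes a per-column blend and illustrative names; the actual blending in the device is not specified in the source.

```python
def alpha_ramps(overlap_px):
    # For each pixel column in the overlap, return the (left device,
    # right device) alpha pair: the left projection fades out as the
    # right projection fades in, so their sum stays 1.0.
    ramps = []
    for i in range(overlap_px):
        t = i / (overlap_px - 1) if overlap_px > 1 else 0.0
        ramps.append((1.0 - t, t))
    return ramps
```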
Referring to
A processor 110 of the electronic device 100 may use a device position determination module 133 to determine the position of the electronic device 100. When the position of the electronic device 100 is not suitable for a projection surface, the processor 110 of the electronic device 100 may output an indicator for instructing to position the electronic device 100 on the projection surface. In addition, when one or more other adjacent electronic devices are recognized, the processor 110 may also output an indicator for instructing to position the electronic device 100 and the other adjacent electronic devices level with one another. The position determination operation of the electronic device is the same as described above with reference to
In operation 2420, the electronic device 100 determines the expandability of a projection image.
The processor 110 of the electronic device 100 may use a projection image expandability determination module 135 to determine whether to expand the projection image. To determine whether to expand the projection image, the projection image expandability determination module 135 may determine whether there is another electronic device in the vicinity of the electronic device 100 using an infrared sensor 140E or an ultrasonic sensor 140D included in a communication module 120 or a sensor module 140. When the projection image expandability determination module 135 determines that there is another electronic device in the vicinity of the electronic device 100 and that an image is to be expanded and projected, the projection image expandability determination module 135 performs a process for the plurality of electronic devices to project the image according to the determination. The expandability determination operation of the electronic device will be described below in detail with reference to
In operation 2430, the electronic device 100 acquires content.
The processor 110 of the electronic device 100 may acquire content to be projected from at least one of a communication module 120, an interface 170, and a memory 130 and may perform buffering or signal decryption so that the acquired content is shown with a projection module, using a signal processing module 132.
In operation 2440, the electronic device 100 analyzes a projection surface.
The processor 110 of the electronic device 100 may use a projection surface analysis module 134 to analyze the projection surface of the electronic device 100 and may use direction information or rotation information of the electronic device 100 to correct geometric information or color information of the projection image or rotate the projection image. The projection surface analysis operation of the electronic device is the same as described in detail with reference to
According to an exemplary embodiment, when the projection image expandability determination module 135 determines to expand the projection image, the projection surface analysis module 134 may determine some content to be projected by the electronic device 100 and some content to be transmitted to another nearby electronic device from all content to be projected.
In operation 2450, the electronic device 100 projects the content.
The processor 110 of the electronic device 100 may project content onto the projection surface using an optical module 160. When it is determined to expand the projection image, the optical module 160 may project some of the content.
Referring to
In operation 2520, the electronic device 100 determines the expandability of the projection image depending on whether there is at least one other electronic device in the vicinity of the electronic device 100.
When there is no other electronic device in the vicinity of the electronic device 100, the electronic device 100 may perform the operation of projecting an image by itself.
When it is determined that there is at least one other electronic device in the vicinity of the electronic device 100, the electronic device 100 and the other electronic device may project content jointly or separately through a setting in the electronic device 100 or through a setting of an application in a device that transmits the content to be projected to the electronic device 100.
Referring to
A processor 110 of the electronic device 100 may use a projection image expandability determination module 135 to determine to expand the projection image when there is at least one other electronic device in the vicinity of the electronic device 100.
In operation 2620, the electronic device 100 divides the projection image into at least one image.
The projection image expandability determination module 135 may divide the projection image into one or more images in order for a plurality of electronic devices to jointly project the projection image.
According to an exemplary embodiment, for example, when two electronic devices jointly project an image, the projection image expandability determination module 135 may divide the image into two parts and determine that one is to be projected by the electronic device 100 and the other is to be projected by an adjacent electronic device.
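The two-device split described above can be sketched as a left/right division of each frame row. The frame model (a list of pixel rows) and the function name are illustrative assumptions.

```python
def split_for_two(frame):
    # Split a frame into left and right halves so two adjacent devices
    # each project one part of the expanded image.
    mid = len(frame[0]) // 2
    left = [row[:mid] for row in frame]
    right = [row[mid:] for row in frame]
    return left, right
```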
In operation 2630, the electronic device 100 transmits the at least one divided projection image to at least one other electronic device.
The projection image expandability determination module 135 transmits the at least one divided projection image to the at least one other electronic device.
According to an exemplary embodiment, the electronic device 100 may transmit only some of the projection images to be projected by an adjacent electronic device to the adjacent electronic device.
According to an exemplary embodiment, the electronic device 100 may transmit all of the projection images to the adjacent electronic device and may also inform the adjacent electronic device of information about which portion of the projection image is to be projected by the adjacent electronic device.
According to an exemplary embodiment, a plurality of electronic devices may communicate with one another to transmit or receive data. The communication may use various methods such as the User Datagram Protocol (UDP), the Transmission Control Protocol (TCP), etc.
According to an exemplary embodiment, a first electronic device that has received data from a smart device may sequentially transfer data to a plurality of other nearby electronic devices.
Referring to
According to an exemplary embodiment, a first electronic device that has received data from a smart device may sequentially deliver the data to a plurality of other nearby electronic devices, and each of the electronic devices that has received the data may transfer the data to other adjacent electronic devices. Screen synchronization may be performed with respect to a time at which at least one of the synchronized electronic devices completes data reception.
Referring to
According to an exemplary embodiment, a first electronic device that has received data from a smart device may transmit the data to two or more of a plurality of adjacent electronic devices at the same time, and each of the adjacent electronic devices that has received the data may also simultaneously transfer the received data to two or more of its adjacent electronic devices.
Referring to
Referring to
Next, in operation 2860, the electronic device 100 processes a user input.
The processor 110 of the electronic device 100 may sense a user interaction using a user input processing module 136 and may perform an operation corresponding to the sensed user interaction. The user input processing module 136 may sense the user interaction using at least one of sensors included in an input device 150, a camera module 191, and a sensor module 140.
According to an exemplary embodiment, the user input processing module 136 may perform an operation corresponding to a user input received through a key 151 included in the input device 150.
According to an exemplary embodiment, the user input processing module 136 may perform an operation corresponding to a user input received through a microphone 182. For example, the electronic device 100 may adjust a projection image of the electronic device 100 using a voice received through the microphone 182. The electronic device 100 may directly recognize the voice of the user using the microphone 182 included in the electronic device 100, or may recognize a voice recognition instruction through a microphone of an external device connected with the electronic device 100, and perform an operation corresponding to the recognized instruction. Various instructions such as Play, Pause, Prev, Next, or Volume Adjustment may be transferred via voice.
According to an exemplary embodiment, the user input processing module 136 may perform an operation corresponding to a user input received through an infrared (IR) sensor 140E included in the sensor module 140. According to an exemplary embodiment, the infrared sensor 140E of the electronic device 100 may sense an input of a user's finger or a pen from the projection image. The infrared sensor 140E includes a light emitting unit and a light receiving unit. The amount of voltage applied to the light receiving unit varies depending on how much of the infrared light emitted from the light emitting unit is reflected by an object and received by the light receiving unit.
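The reflection-based sensing described above reduces to comparing the voltage at the light receiving unit against a threshold: a nearby finger or pen reflects more of the emitted infrared light, raising the received voltage. The function name and the 1.2 V threshold are hypothetical.

```python
def touch_detected(received_voltage, threshold_voltage=1.2):
    # A finger or pen near the projection surface reflects more IR light
    # back to the receiving unit, so a higher voltage indicates a touch.
    return received_voltage >= threshold_voltage
```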
A method of receiving a user input using an infrared sensor will be described with reference to
Referring to
According to an exemplary embodiment, the user input processing module 136 may perform an operation corresponding to a user input received through the camera module 191.
Referring to
In addition, according to an exemplary embodiment, a camera included in an external device connected with the electronic device 100, rather than the camera module 191 included in the electronic device 100, may recognize the user's gesture and transmit an instruction corresponding to the recognized gesture to the electronic device 100. Thus, the electronic device 100 may process an operation for the instruction.
Referring to
A processor 110 of the electronic device 100 may use a user input processing module 136 to sense the user input that instructs to move the object displayed on the projection image. The user input processing module 136 may use the infrared sensor 140E or the camera module 191 to sense the user input. The object displayed on the projection image may include, for example, an icon, an image, text, a button, etc.
In operation 3120, the electronic device 100 outputs a projection image over which an object moves in response to the sensed user input.
When an operation of touching any object displayed on the projection image with a user input unit such as a user finger 50 or a pen 60 is recognized through the camera module 191 or the infrared sensor 140E, the user input processing module 136 may sense that the movement of the object has started. Next, the camera module 191 or the infrared sensor 140E may trace a path along which the user moves the object displayed over the projection image, and the projection module 162 may project an image in which the object is moved in real time in accordance with the traced path, thus giving the user the feeling of dragging the object with an interaction unit such as a finger.
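Tracing the touch path and redrawing the object each frame, as described above, can be sketched as accumulating the deltas between successive touch points. The function name and the coordinate model are illustrative assumptions.

```python
def drag_object(obj_pos, touch_path):
    # Follow the traced touch path and return the object's position for
    # each frame, so the projected image can redraw the object in real
    # time as the user drags it.
    x, y = obj_pos
    frames = []
    prev = touch_path[0]
    for px, py in touch_path:
        x += px - prev[0]
        y += py - prev[1]
        prev = (px, py)
        frames.append((x, y))
    return frames
```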
Referring to
A processor 110 of the electronic device 100 may use a user input processing module 136 to sense the user input that instructs to move the object displayed on the projection image. As described above, the user input processing module 136 may use the infrared sensor 140E or the camera module 191 to sense the user interaction.
The user interaction for moving an object displayed on a projection image to a projection image corresponding to another electronic device may be determined in various ways. For example, the user interaction may be an operation of a user input unit moving an object to another projection image, an operation of a user input unit drawing a circle on the object, or a gesture of the user input unit performed after the object is designated.
In operation 3220, the electronic device 100 transmits information on an object to another electronic device in response to the sensed user input.
A processor 110 of the electronic device 100 may use a user input processing module 136 to transmit the information on the object to the other electronic device in response to the sensed user input. The user input processing module 136 may use a communication module 120 to transmit information about the object to the other electronic device. An operation described in
Referring to
When a user performs an operation of instructing to move an object 90 displayed on a projection image of screen A using an interaction unit such as his or her finger 50 or a pen, a camera 191-1 of the electronic device 100-1 may sense the user interaction, and a projection module of the electronic device 100-1 may project an image in which the object is moved according to the movement of the user input unit.
The electronic device 100-1 may transmit the information on the object to the electronic device 100-2 through a network at a predetermined time while the object is moved from screen A to screen B and then output by an instruction of the user input unit. For example, the predetermined time at which the object is transmitted from the electronic device 100-1 to the electronic device 100-2 may be a time point at which the object crosses to screen B (when, while, or after the object enters a boundary between screen A and screen B) or a time point at which the user releases the object, similarly to a touch release.
In addition, by transmitting the information on the object from the electronic device 100-1 to the electronic device 100-2, the object 90 may be moved from the electronic device 100-1 to the electronic device 100-2, or the object 90 may be copied from the electronic device 100-1 to the electronic device 100-2.
According to an exemplary embodiment, when the electronic device 100-1 transmits object information to the electronic device 100-2, the electronic device 100-1 may use various communication technologies. The electronic device 100-1 and the electronic device 100-2 may transmit and receive information in the same AP zone through Wi-Fi technology or may be connected through Bluetooth to transmit and receive information. In addition, the information may also be transferred without a separate AP by utilizing technology such as Wi-Fi Direct. In addition, when the electronic device 100-1 and the electronic device 100-2 are not connected (e.g., paired) in advance, the connection may be requested from an electronic device that displays screen A or an electronic device that displays screen B at a time at which the object information is passed.
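Choosing when to hand the object off from screen A to screen B, as in the boundary-crossing example above, can be sketched as a comparison of the object's position with the screen boundary. The function name, the x-axis-only model, and the string labels are hypothetical.

```python
def handoff_target(object_x, boundary_x):
    # Once the dragged object's position crosses the boundary between the
    # two screens, the object information is sent to the other device.
    return "device_B" if object_x >= boundary_x else "device_A"
```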
Referring to
In addition, when the object 90 is moved and output to screen B, the electronic device 100-2 may recognize, through a camera module 191-2, a user interaction in which the user directs and moves the object on screen B and may project an image according to the movement of the object.
When a user performs an operation of manipulating an object 90 displayed on a projection image of screen A using an input unit such as his or her finger 50 or a pen, a camera 191-1 of an electronic device 100-1 may sense the user input, and a projection module of the electronic device 100-1 may project an image in which the object is moved according to the movement of the user input unit. In addition, the electronic device 100-1 may transmit information on an object 90 to an electronic device 100-2 over a network.
According to an exemplary embodiment, a manipulation of the object 90 with a user input unit may include a gesture of the user input unit.
Referring to
An example in which the object 90 is output to a screen of an image projected by the electronic device 100-2 according to the operation corresponding to the user gesture is shown in
Exemplary embodiments in which an object is moved according to a user input when a plurality of electronic devices are disposed adjacent to one another will be described with reference to
Referring to
According to an exemplary embodiment, when a plurality of electronic devices are disposed adjacent to one another, in response to a gesture of a user input unit of moving an object displayed on one screen toward a screen positioned in another direction, a representation of the object may be moved in a direction of the corresponding screen, or information on the object may be transmitted to an electronic device that projects the corresponding screen.
For example, a camera module 191-1 of the electronic device 100-1 may sense one of a gesture 91 of the user moving the object 90 displayed on the projection image of screen A toward screen B, a gesture 92 of the user moving the object 90 toward screen C, and a gesture 93 of the user moving the object 90 toward screen D, through an input unit such as the user's finger 50 or the pen. According to the user gesture sensed by the camera module 191-1, the electronic device 100-1 may display the movement of the object in the direction corresponding to the sensed user gesture or may transmit information on the object to the electronic device that projects the screen in that direction.
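The gesture-to-screen mapping above can be sketched by classifying the drag vector's direction into one of the neighboring screens. This is a hypothetical illustration; the angle sectors, screen names, and coordinate convention are assumptions, not details from the disclosure.

```python
# Hypothetical sketch: map a sensed drag vector (dx, dy) on screen A to an
# adjacent screen (B to the right, C above, D to the left), as in the
# multi-device arrangement described above. Coordinates: +x right, +y up.
import math

NEIGHBORS = {"right": "screen B", "up": "screen C", "left": "screen D"}


def target_screen(dx: float, dy: float):
    """Classify a drag vector into a neighboring screen, or None if the
    drag points in a direction with no adjacent screen (downward here)."""
    angle = math.degrees(math.atan2(dy, dx)) % 360
    if angle < 45 or angle >= 315:
        return NEIGHBORS["right"]
    if 45 <= angle < 135:
        return NEIGHBORS["up"]
    if 135 <= angle < 225:
        return NEIGHBORS["left"]
    return None  # downward drag: no screen in that direction
```

A rightward drag thus selects screen B, an upward drag screen C, and a leftward drag screen D; a downward drag selects nothing.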
Referring to
Referring to
When the user points to the object 95 with the finger 50 while the user interface is displayed, the camera module 191-1 of the electronic device 100-1 may recognize the selection of the user, and the electronic device 100-1 may move the object 90 to an electronic device 100-3 corresponding to the selected object 95 or may transmit information on the object 90 to the electronic device 100-3 in response to the user's selection.
According to an exemplary embodiment, the electronic device may sense a bending of a flexible device as a user input.
Referring to
The electronic device 100-1 may recognize a bending of the flexible device 1 as one instruction. The electronic device 100-1 may move the object 90 displayed on screen A to screen B or may transmit the information on the object 90 to the electronic device 100-2 in response to the recognized instruction.
In this case, the electronic device 100-1 may recognize the bending of the flexible device 1 by capturing the bending using a camera module or by receiving an instruction corresponding to the bending of the flexible device 1 from the flexible device 1.
An operation of moving an object using a plurality of electronic devices may be applied in one or more exemplary embodiments.
According to an exemplary embodiment, when one electronic device outputs a screen that uses a photograph app, and another electronic device outputs a file browser screen, at least one photograph included in the photograph app may be moved to the file browser screen in response to a user interaction of moving the object.
Referring to
An example in which the photograph 73 selected by the user is moved to the photograph app screen 72 projected by the electronic device 100-2 according to the above user gesture is shown in
According to an application example, when each of a plurality of users wants to send a photograph to another user while viewing photographs on a large screen using a photograph app, the user may conveniently and easily transfer the photograph to the other user.
According to an exemplary embodiment, when one electronic device outputs one PPT screen, and another electronic device outputs another PPT screen, an object included in one PPT screen may be moved to the other PPT screen in response to a user input for moving the object.
Referring to
An example in which the object 76 selected by the user is moved to the second PPT screen 75 projected by the electronic device 100-2 according to the above user gesture is shown in
According to an exemplary embodiment, when each of a plurality of users wants to send an object to another user while working on a large screen composed of PPT screens, the user may conveniently and easily transfer the object to the other user.
In addition, when one electronic device projects a webpage screen, and another electronic device projects a chatting program screen, a message or image displayed in a webpage screen may be moved to the chatting program screen in response to a user interaction for selecting and moving the message or image.
Referring to
According to an exemplary embodiment, projection may be performed onto a projection surface at the side even at a short projection distance by including an ultra-short focus lens in the lens combination of the projection system lens unit 13.
According to an exemplary embodiment, the optical system may be configured by horizontally positioning the illumination optical system and the projection system lens unit, excluding the prism.
According to an exemplary embodiment, by adjusting a length such that the housing 16 may be expanded and folded, the electronic device may be folded to reduce its size when the electronic device is carried, and the housing 16 may be expanded to secure a projection distance when the electronic device actually performs projection.
Referring to
Referring to
Referring to
Referring to
As shown in
Referring to
In an airplane, when the electronic device 100 is installed to project an image, an image larger than the folding table may be projected, thus allowing the user to enjoy a somewhat larger screen.
As described above, the projection surface needed to project an image may be secured by installing an extendable electronic device.
Referring to
Referring to
In addition, by installing a plurality of electronic devices to expand the projection surface, it is possible to utilize the entire windshield as one display and to display information about nearby vehicles in situations in which driving is difficult, such as on a foggy road, thereby facilitating safe driving. While the vehicle is stopped, the expanded projection surface may be used to watch a movie on a large screen.
Referring to
According to an exemplary embodiment, when the electronic devices are attached to a ceiling of the vehicle, a vehicle passenger may enjoy a movie while he or she leans back in his or her seat.
The term “module” used in one or more exemplary embodiments may refer to a unit including a combination of one or more of hardware, software, and firmware. The “module” may be interchangeably used with a term such as a unit, logic, a logical block, a component, or a circuit. The “module” may be a minimum unit of parts formed integrally as one piece or a part thereof. The “module” may be a minimum unit that performs one or more functions or a part of the minimum unit. The “module” may be mechanically or electronically implemented. For example, the “module” according to one or more exemplary embodiments may include at least one of an application-specific integrated circuit (ASIC) chip, field programmable gate arrays (FPGAs), and a programmable-logic device that are well-known or to be developed, which perform certain operations.
According to one or more exemplary embodiments, at least a part of a device (e.g., modules or functions thereof) or a method (e.g., operations) according to one or more exemplary embodiments may be implemented with instructions stored in, for example, a non-transitory computer-readable storage medium in the form of a programming module. When the instructions are executed by one or more processors (e.g., processor 110), the one or more processors may perform functions corresponding to the instructions. The non-transitory computer-readable storage medium may be, for example, the memory 130. For example, at least a part of the programming module may be implemented (e.g., executed) by the processor 110. For example, at least a part of the programming module may include at least one of a module, a program, a routine, sets of instructions, and a process to perform one or more functions.
The non-transitory computer-readable storage medium may include magnetic media such as a hard disk, a floppy disk, and a magnetic tape, optical media such as a compact disc read only memory (CD-ROM) and a digital versatile disc (DVD), magneto-optical media such as a floptical disk, and a hardware device configured to store and execute program instructions (e.g., a programming module) such as a read only memory (ROM), a random access memory (RAM), and a flash memory. The program instructions may include machine language code generated by a compiler, and high-level language code that may be executed by a computer using an interpreter. The hardware device may be configured to operate as one or more software modules for performing operations according to one or more exemplary embodiments, or vice versa.
A module or programming module according to one or more exemplary embodiments may include at least one of the foregoing components, or some of the foregoing components may be omitted or other components may be further included. Operations executed by a module, a programming module, or other components according to one or more exemplary embodiments may be performed sequentially, in parallel, repetitively, or heuristically. Some operations may be executed in another order or may be omitted, or other operations may be added.
According to one or more exemplary embodiments, in a non-transitory computer-readable recording medium having recorded thereon instructions, the instructions, when executed by at least one processor, cause the at least one processor to perform at least one operation, the at least one operation including determining whether the electronic device is positioned on a projection surface, providing a guide to position the electronic device on the projection surface according to the determination, and projecting the content onto the projection surface using the projection module.
According to one or more exemplary embodiments, a user may easily position his or her content on a projection surface such as a wall or a ceiling without focus or keystone adjustment, by positioning the electronic device including an optical module on the projection surface through a user interaction.
According to an exemplary embodiment, power is saved by performing projection only when the electronic device is positioned on the projection surface or within a predetermined distance from the projection surface, and by shutting off the optical module at other times.
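The power-saving behavior above (and the contact-based control recited in the claims) can be sketched as a simple controller. This is a minimal illustration under stated assumptions: the class name, threshold constant, and update method are hypothetical, not part of the disclosure.

```python
# Hypothetical sketch: enable the projector only while the device is in
# contact with the projection surface (distance == 0, per the claims);
# otherwise shut off the optical module to save power.

CONTACT_THRESHOLD = 0.0  # a distance of zero means contact


class OpticalModuleController:
    def __init__(self):
        self.projecting = False

    def update(self, distance: float) -> bool:
        """Turn projection on at contact and off otherwise; return the
        resulting projection state."""
        self.projecting = distance <= CONTACT_THRESHOLD
        return self.projecting
```

A variant of this sketch could compare against a nonzero predetermined distance instead of strict contact, matching the "within a predetermined range" embodiment.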
According to an exemplary embodiment, a user may expand the projection surface to enlarge a screen and share content by positioning two or more electronic devices adjacent to one another.
The above-described exemplary embodiments are intended for purposes of illustration to describe technical details and help in understanding the present disclosure, and are not intended to limit the scope of the present disclosure. While the present disclosure has been shown and described with reference to one or more exemplary embodiments, it should be understood by those skilled in the art that various changes in form and details may be made without departing from the spirit and scope of the present disclosure, as defined by the appended claims and their equivalents.
Kim, Ji-Hyun, Kim, Hee-Kyung, Kang, Myung-Su, Kim, Hyo-Jung, Kim, Jung-Hyung, Park, Jung-chul, Song, Se-Jun, Namgoong, Bo-Ram, Yang, Sung-kwang, Lee, Yoon-gi