An information-inputting device is provided, which includes a plurality of photographing units photographing an area on a plane. An object located on the plane is then extracted from an image that includes the plane and the object, and it is determined whether the object is a specific object. If the object is the specific object, a position of a contact point between the specific object and the plane is calculated.
1. A processor-implemented method of inputting information comprising the steps of:
extracting an object located on a plane from an image that includes the plane and the object;
determining whether the object is a writing implement;
calculating, with at least one processor, a position of a contact point between the writing implement and the plane as information to be input if the object has been determined as the writing implement; and
inputting the information representing a position on the plane indicated by the object,
wherein when a plurality of objects are extracted from the image, one of the plurality of objects that satisfies a prescribed condition is determined as the writing implement, and the contact position of the writing implement is input as the information.
14. A processor-implemented method of inputting information comprising the steps of:
extracting an object located on a plane from an image that includes the plane and the object;
recognizing whether the object is a writing implement; and
calculating, with at least one processor, a position of a contact point between the writing implement and the plane as information to be input and representing an indicated position on the plane using two-dimensional coordinates if the object has been recognized as the writing implement,
wherein when a plurality of objects are extracted from the image, one of the plurality of objects that satisfies a prescribed condition is determined as the writing implement, and the contact position of the writing implement is input as the position information.
11. A processor-implemented method of inputting information comprising:
extracting an object located on a plane from an image that includes the plane and the object;
determining whether the object is a writing implement; and
calculating and inputting a position of a contact point between the writing implement and the plane as position information representing a designated position on the plane and represented by two-dimensional coordinates if the object is recognized as the writing implement,
wherein when a plurality of objects are extracted from the image, one of the plurality of objects that satisfies a prescribed condition is determined, with at least one processor, as the writing implement, and the contact position of the writing implement is input as the position information.
0. 29. Apparatus usable with at least one processing structure for inputting information, comprising:
a display having at least one camera at a corner thereof; and
at least one computer readable medium having program instructions configured to cause the at least one processing structure to:
extract a plurality of objects located on a plane of the display from an image that includes the plane and the objects;
recognize whether one of the objects comprises a writing implement by determining that the one object satisfies a prescribed condition; and
calculate a position of a contact point between the writing implement and the plane as position information to be input and representing an indicated position on the plane using two-dimensional coordinates if the object has been recognized as the writing implement.
15. An information-inputting device comprising:
a plurality of photographing units configured to photograph an area on a plane;
an object-recognizing unit configured to extract an object located on the plane from a photographed image and to recognize whether the extracted object is a writing implement; and
a location-calculating processing unit configured to calculate a position of a contact point between the writing implement and the plane from the photographed image if the object has been recognized as the writing implement,
wherein when a plurality of objects are extracted from the image, one of the plurality of objects that satisfies a prescribed condition is determined as the writing implement, and the contact position of the writing implement is calculated by the location-calculating processing unit as the contact point.
0. 27. Apparatus usable with at least one processing structure for inputting information, comprising:
a display having at least one camera at a corner thereof; and
at least one computer readable medium having program instructions configured to cause the at least one processing structure to:
extract a plurality of objects located on a plane of the display from an image that includes the plane and the objects;
determine whether the object is a writing implement by determining that one of the plurality of objects that satisfies a prescribed condition is the writing implement; and
calculate and input a position of a contact point between the writing implement and the plane as position information representing a designated position on the plane and represented by two-dimensional coordinates if the object is recognized as the writing implement.
0. 16. Apparatus usable with at least one processing structure for inputting information, comprising:
a display device having at least one camera in a corner thereof; and
at least one computer readable medium having program instructions configured to cause the at least one processing structure to:
extract an object located on a plane of the display device from an image that includes said plane and the object;
determine whether the object is a writing implement by determining, when a plurality of objects are extracted from the image, that one of the plurality of objects that satisfies a prescribed condition is the writing implement;
calculate a position of a contact point between the writing implement and said plane as information to be input if the object has been determined as the writing implement; and
input the information representing a position on said plane indicated by the object.
0. 30. Apparatus usable with at least one processing structure for inputting information, comprising:
a display having a plurality of photographing units configured to photograph an area on a plane of the display; and
at least one computer readable medium having program instructions configured to cause the at least one processing structure to:
extract an object located on the plane from a photographed image and to recognize whether the extracted object is a writing implement;
calculate a position of a contact point between the writing implement and the plane from the photographed image if the object has been recognized as the writing implement;
when a plurality of objects are extracted from the image and one of the plurality of objects satisfies a prescribed condition, determine that (i) that object is the writing implement, and (ii) the contact position of the writing implement is the contact point.
2. The method as claimed in
observing movement of the writing implement approaching the plane if the object has been recognized as the writing implement; and
inputting the contact position of the writing implement on the plane as the information when the writing implement has stopped in a perpendicular direction to the plane.
3. The method as claimed in
4. The method as claimed in
5. The method as claimed in
6. The method as claimed in
7. The method as claimed in
8. The method as claimed in
9. The method as claimed in
10. The method as claimed
12. The method according to
displaying the image based on the information.
13. The method according to
displaying the image based on the locus information.
0. 17. Apparatus usable with at least one processing structure for inputting information according to claim 16, wherein the at least one computer readable medium has program instructions configured to cause the at least one processing structure to:
observe movement of the writing implement approaching the plane if the object has been recognized as the writing implement; and
input the contact position of the writing implement on the plane as the information when the writing implement has stopped in a perpendicular direction to the plane.
0. 18. Apparatus usable with at least one processing structure for inputting information according to claim 16, wherein the at least one computer readable medium has program instructions configured to cause the at least one processing structure to:
recognize whether the object is the writing implement by comparing a shape of the object with a specific shape that has been selected in advance from a plurality of specific shapes.
0. 19. Apparatus usable with at least one processing structure for inputting information according to claim 16, wherein the at least one computer readable medium has program instructions configured to cause the at least one processing structure to:
recognize whether the object is the writing implement by comparing a shape of the object with a registered shape of the object.
0. 20. Apparatus usable with at least one processing structure for inputting information according to claim 16, wherein the at least one computer readable medium has program instructions configured to cause the at least one processing structure to:
determine that the contact position that satisfies the prescribed condition is the contact position located at a shortest distance from a fixed point on the plane among distances from the fixed point to each of the plurality of the contact positions.
0. 21. Apparatus usable with at least one processing structure for inputting information according to claim 16, wherein the at least one computer readable medium has program instructions configured to cause the at least one processing structure to:
input the contact position as valid information if the contact position is located in a fixed area on the plane.
0. 22. Apparatus usable with at least one processing structure for inputting information according to claim 20, wherein the fixed area comprises an area selected from a plurality of areas whose shapes and sizes are different from each other.
0. 23. Apparatus usable with at least one processing structure for inputting information according to claim 20, wherein the fixed area comprises an area that can be set to a fixed shape.
0. 24. Apparatus usable with at least one processing structure for inputting information according to claim 23, wherein the fixed area comprises one of (i) a closed area formed by a track of the contact position of the writing implement on the plane and (ii) an area formed by said track and a width of a predetermined area.
0. 25. Apparatus usable with at least one processing structure for inputting information according to claim 16, wherein the at least one computer readable medium has program instructions configured to cause the at least one processing structure to:
define the contact position as invalid information and signal the invalid information if the contact position is not located in a fixed area.
0. 26. Apparatus usable with at least one processing structure for inputting information according to claim 16, wherein the at least one computer readable medium has program instructions configured to cause the at least one processing structure to:
display the image based on the position information.
0. 28. Apparatus usable with at least one processing structure for inputting information according to claim 26, wherein the at least one computer readable medium has program instructions configured to cause the at least one processing structure to:
display the image based on the position information.
1. Field of the Invention
The present invention relates to an information-inputting device that can record data in a recording medium in real time while the data is being written on a recording surface without specifying types of writing implements and materials used for the recording surface.
2. Description of the Related Art
Methods and apparatuses for inputting information have been proposed, and have been put to practical use. For instance, Japanese Laid-open Patent Application No. 6-289989 discloses an information recognition device. The information recognition device provides a camera on one side of a recording surface. The information recognition device detects coordinates of each position of information recorded by a writing implement on the recording surface, for example, a track of letters written on recording paper, and develops depiction data based on the coordinates. The information recognition device alternatively detects coordinates of each position of a writing implement, that is, coordinates of a tip of the writing implement, and develops depiction data based on the coordinates. To recognize the information recorded on the recording surface, the information recognition device also needs to detect the origin of the coordinates on the recording surface by detecting corners of the recording paper, that is, the recording surface, and the origin of the X-axis and the Y-axis recorded on the recording paper.
The information recognition device must detect the corners of the recording paper or the origin of the coordinates based on the information recorded on the recording paper before detecting the coordinates of each position of the information every time the recording paper is exchanged. Additionally, in a case that the recording paper is moved from its original position while the information is being recorded on it, or in a case that the positions of the corners of the recording paper move within the image being taken by the camera, the coordinates of the information recorded on the recording paper are detected by the camera as coordinates different from what they are supposed to be, since those coordinates are calculated based on the origin of the coordinates. Consequently, the camera detects information different from what a user intends to record unless the moved origin of the coordinates is detected.
Additionally, when the camera photographs the information recorded on the recording paper from an upper oblique direction of the recording paper, the error between the actual coordinates of the information recorded on the recording paper and the coordinates detected by the camera increases as the distance between the recording paper and the camera increases. Such a problem will be described with reference to
The number of pixels provided in the imaging device necessary for obtaining image data by photographing the coordinates of each position of the information depends on a size of a recording area on the recording paper and a resolution of reading the coordinates of each position of the information recorded on the recording paper. As the size of the recording area increases, the number of pixels necessary for calculating the coordinates from the image data also increases. Additionally, for a higher resolution of reading the coordinates, the number of pixels must be greater. Furthermore, a frame rate of image signals outputted from the imaging device must be high in order to clearly monitor an information recording process from the photographed image data. However, an imaging device with a large number of pixels and a high frame rate is generally expensive, and thus it is hard to hold down the production cost of an information input device by mounting such an imaging device thereon. Consequently, a writing input device with a comparatively inexpensive imaging device using a smaller number of pixels has been requested.
Additionally, the size of an image-display device used in a portable writing input device for displaying the image data obtained by the camera is preferably small for miniaturization of the device and for its electric efficiency. However, if the size of the image-display device is small, the number of pixels displayed on the image-display device becomes small, and thus the quality of the displayed image data decreases when the image data is stretched excessively to display a page of it thereon. Furthermore, when the number of imaging devices used for photographing the information recorded on the recording paper is small, a wide-angle lens should be attached to each of the imaging devices for photographing the information. In such a case, the resolution of reading the coordinates of the information recorded on the recording paper differs depending on where the information is recorded on the recording paper.
Accordingly, it is a general object of the present invention to provide an information-inputting device used for writing data on a recording surface by use of a writing implement. A more specific object of the present invention is to provide an information-inputting device, a writing input device and a portable electronic writing input device that are easily carried and are used for writing data on a desired recorded material whose surface is a plane by use of a desired writing implement. Another object of the present invention is to provide a method and an apparatus for recording data in a recording medium in real time while the data is being written on a recording surface by use of a writing implement in a case in which the writing implement is detected. Yet another object of the present invention is to provide a method of managing written data and a recording medium for creating a page of the written data by dividing the written data into a plurality of parts and by inputting the plurality of parts by a recording area in a case in which a resolution of reading the written data in the recording area is low. Yet another object of the present invention is to provide a method of managing written data and a recording medium for controlling a pixel density of the written data to be even throughout an entire recording area. Yet another object of the present invention is to provide a method of controlling display of a recording area to increase operability of writing data in the recording area.
The above-described objects of the present invention are achieved by an information-inputting device including a plurality of photographing units photographing an area on a plane; an object-recognizing unit extracting an object located on the plane from a photographed image, and recognizing whether the object is a specific object; a location-calculating unit calculating a contact position of the specific object on the plane from the photographed image if the object has been recognized as the specific object; and a data-storing unit storing information about a track of the contact position while the specific object is contacting the plane.
The above-described objects of the present invention are also achieved by a method of inputting information including the steps of extracting an object located on a plane from an image that includes the plane and the object; recognizing whether the object is a specific object; and inputting a contact position of the specific object on the plane as information if the object has been recognized as the specific object.
The above-described objects of the present invention are also achieved by a writing input device including an image-inputting unit photographing a recording area on a plane by providing a plurality of electronic cameras that include imaging devices; an object-recognizing unit extracting an object located on the plane from a photographed image, and determining whether the object is a writing implement by recognizing a shape of the object; a coordinate-calculating unit calculating contact coordinates of the object on the plane based on an image of the object on an imaging device if the object has been determined as the writing implement; a data-storing unit storing a series of the contact coordinates while the object is contacting the plane; and a displaying unit creating depiction data from the series of the contact coordinates, and displaying the depiction data thereon.
The above-described objects of the present invention are also achieved by a method of managing written data in a writing input device, wherein the writing input device includes an image-inputting unit photographing a recording area on a plane by providing a plurality of electronic cameras that includes imaging devices; an object-recognizing unit extracting an object located on the plane from a photographed image, and determining whether the object is a writing implement by recognizing a shape of the object; a coordinate-calculating unit calculating contact coordinates of the object on the plane based on an image of the object on an imaging device if the object has been determined as the writing implement; a data-storing unit storing a series of the contact coordinates while the object is contacting the plane; and a displaying unit creating depiction data from the series of the contact coordinates, and displaying the depiction data thereon, the method including the steps of dividing a page of a data area into a plurality of areas; assigning one of the areas to the recording area; and managing the written data to be recorded in the recording area as data of the one of the areas in the page.
The above-described objects of the present invention are also achieved by a portable electronic writing input device including a main body unit; a first camera unit; a second camera unit; and an expansion/contraction unit connecting the first and second camera units on left and right parts of the main body unit as well as expanding or contracting an interval between the main body unit and the first or second camera unit, wherein the portable electronic writing input device, being placed on a plane material, photographs a movement of a writing implement by use of the first and second camera units, when a user writes data on the plane material by using the writing implement.
The above-described objects of the present invention are also achieved by a recording medium readable by a computer, tangibly embodying a program of instructions executable by the computer to perform a method, the method including the steps of extracting an object located on a plane from an image including the object and the plane; recognizing whether the object is a specific object; calculating a contact position of the specific object on the plane if the object has been recognized as the specific object; storing written data including a series of coordinates of the contact position calculated while the object is contacting the plane; generating depiction data from the written data; and displaying the depiction data.
The information-inputting device determines a shape of the object on the plane by use of the object-recognizing unit, and calculates the contact positions of the specific object by use of the location-calculating unit if the object has been determined as the specific object by the object-recognizing unit. Subsequently, the information-inputting device stores the track of the contact position while the specific object is contacting the plane. Thus, the information-inputting device can record data in a recording medium in real time while the data is being written on a recording surface without specifying types of writing implements and materials used for the recording surface.
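The recording flow summarized above can be sketched as a simple loop: while the object is recognized as the specific object (the writing implement) and is contacting the plane, contact positions accumulate into a track; when contact ends, the track is stored. The following sketch assumes hypothetical `recognize` and `locate` callbacks standing in for the object-recognizing unit and the location-calculating unit; neither name comes from the specification.

```python
def record_strokes(frames, recognize, locate):
    """Minimal sketch of the recording loop.

    recognize(frame) -> True while a writing implement is contacting
    the plane; locate(frame) -> (x, y) contact position.  Both are
    hypothetical placeholders for the units described above.
    Each continuous track of contact positions is stored as one stroke.
    """
    strokes, current = [], []
    for frame in frames:
        if recognize(frame):
            current.append(locate(frame))
        elif current:
            strokes.append(current)  # implement lifted: close the track
            current = []
    if current:
        strokes.append(current)
    return strokes
```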
Other objects, features and advantages of the present invention will become more apparent from the following detailed description when read in conjunction with the accompanying drawings.
A description will now be given of preferred embodiments of the present invention, with reference to the accompanying drawings.
The expansion/contraction units 6 and 7 are expanded and contracted by hand operations, and adjust a distance between the left camera unit 2 and the right camera unit 3. The LCD 8 displays information such as letters that are written in the recording area 13. Additionally, the LCD 8 includes various buttons used for its operations, but the buttons are not shown in
The mirror 14 reflects incident light entering through the photographing window 4 toward the wide-angle lens 15. The wide-angle lens 15 having a view angle of 90 degrees is placed in the left camera unit 2 to transmit the light reflected by the mirror 14 to the CMOS image sensor 16. The CMOS image sensor 16 executes photoelectric conversion on the light received through the wide-angle lens 15 to an analog signal by use of a photo-diode provided for each pixel therein, amplifies the analog signal by using a cell amplifier provided for each pixel, and outputs the amplified analog signal to the image-processing unit 17 every time a fixed period passes. Similarly, in the right camera unit 3, the mirror 19 reflects incident light entering through the photographing window 5 toward the wide-angle lens 20. The wide-angle lens 20 having a view angle of 90 degrees is placed in the right camera unit 3 to transmit the light reflected by the mirror 19 to the CMOS image sensor 21. The CMOS image sensor 21 executes photoelectric conversion on the light received through the wide-angle lens 20 to an analog signal by use of a photo-diode provided for each pixel therein, amplifies the analog signal by using a cell amplifier provided for each pixel, and outputs the amplified analog signal to the image-processing unit 22 every time a fixed period passes.
Each of the image-processing circuits 17 and 22 includes an A/D (Analog/Digital) conversion circuit, and converts an analog signal respectively received from the CMOS image sensors 16 and 21 to a digital signal. Subsequently, the image-processing circuits 17 and 22 execute a process to extract an outline of a subject image from image data obtained by converting the analog signal to the digital signal, an image recognition process to decide whether the subject image is a writing implement based on the extracted outline, and a process to output information about a position where an object detected as the writing implement is contacting the recording surface. It should be noted that the left camera unit 2 and the right camera unit 3 perform the above-described processes synchronously with each other.
The ROM 24 initially stores a program to control the writing input device 1. The main memory 25 includes a DRAM (Dynamic Random Access Memory), and is used as a work area for the CPU 23. The flash memory 26 stores coordinate data, that is, data about the coordinates of the information recorded in the recording area 13. The operation unit 27 includes various types of keys near the LCD 8, the keys being used for displaying the coordinate data stored in the flash memory 26 on the LCD 8, for forwarding the coordinate data to a personal computer through a USB cable, and for other objects. The LCD display-control unit 28 controls displaying of the coordinate data, an operation menu, and the like on the LCD 8. The USB driver 29 transmits data to a device such as a personal computer connected to a USB cable, and receives data from the device, by executing operations based on a USB standard. The sound source 30 generates a sound signal such as an alarm, and then the generated sound signal is outputted from the speaker 32. The battery 34 is, for example, a nickel-metal hydride battery or a lithium battery. An electric current is supplied from the battery 34 through the DC-DC converter 33 to units in the writing input device 1 in addition to the LEDs 9 through 12. The system bus 35 connects the image-processing units 17 and 22, the CPU 23, the ROM 24, the main memory 25, the flash memory 26, the operation unit 27, the LCD display-control unit 28, the USB driver 29, and the sound source 30.
A description will now be given of a method to obtain coordinates of a contact point of a writing implement contacting the recording surface. As described with reference to
θ = arctan(h/f)   EQ1
β1 = α − θ   EQ2
A value of the angle β1 for the left camera unit 2 can be obtained from the above equations since the angle α is initially measured as an angle for positioning the wide-angle lenses 15 and 20 so that the optical axes of the wide-angle lenses 15 and 20 intersect with each other. Similarly, a value of the angle β2 can be obtained for the right camera unit 3. Once the values of the angles β1 and β2 have been obtained, the coordinates of the contact point A (x, y) can be obtained by using trigonometry.
x = L*tan β2/(tan β1 + tan β2)   EQ3
y = x*tan β1   EQ4
Assuming the mirrors 14 and 19 do not exist in the left camera unit 2 and the right camera unit 3, the description has been given of the method to calculate the coordinates of the contact point A, that is, the contact point of the writing implement. However, since the mirrors 14 and 19 are provided in the writing input device 1 only for changing a direction of the optical axis placed in each of the left camera unit 2 and the right camera unit 3 by reflecting light on the optical axis, the above-described method can also be applied to calculate the coordinates of the contact point of the writing implement in a case that mirrors are placed in each of the left camera unit 2 and the right camera unit 3. As described above, both an optical axis of light emitted toward the left camera unit 2 and an optical axis of light emitted toward the right camera unit 3 are parallel to a recording surface, and intersect each other. Accordingly, both the left camera unit 2 and the right camera unit 3 can detect the coordinates of a contact point accurately with no errors. In addition, an area where coordinates of the contact point are read by the left camera unit 2 and the right camera unit 3 can be kept wide.
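The triangulation in EQ3 and EQ4 can be sketched in a few lines. The function name and argument order below are illustrative assumptions, not part of the device; angles are in radians and L is the baseline length between the two camera units.

```python
import math

def contact_point(beta1, beta2, L):
    """Triangulate the contact point A(x, y) on the recording surface.

    beta1, beta2: sight angles of the left and right camera units
    toward the contact point, as in EQ3 and EQ4.
    L: distance between the two camera units (the baseline).
    """
    x = L * math.tan(beta2) / (math.tan(beta1) + math.tan(beta2))  # EQ3
    y = x * math.tan(beta1)                                        # EQ4
    return x, y
```

With β1 = β2 = 45° and L = 2, the point lies at the midpoint of the baseline at height 1, matching the symmetric geometry.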
A detailed description will now be given of embodiments based on the writing input device 1 according to the present invention.
A first embodiment of the present invention relates to a method of storing information as digital data in a recording medium in real time while a user is recording the information by use of a pen on a recorded material such as paper. The first embodiment additionally relates to a method of storing information as digital data in a recording medium in real time while a user is recording the information on the recorded material by use of a writing implement such as a stick or a finger whereto ink is not applied.
The image-processing circuit 17 of the left camera unit 2 converts an analog image signal outputted from the CMOS image sensor 16 into a digital image signal in order to obtain a frame of image data, and extracts an outline of an object from the image data at a step S101. In a case that the number of pixels of the CMOS image sensor 16 in a vertical direction (an upward direction of a vertical side 39) is large, the image-processing circuit 17 outputs image signals only for those pixels of the CMOS image sensor 16 that form an image having a fixed height from a recording surface 38. An example of the above-described method of extracting the outline of the object is to calculate a density gradient between pixels by applying differentiation, and then to define the outline based on the direction of increasing density gradient and the size of the calculated density gradient. Such a method is disclosed in Japanese Laid-open Patent Application No. 63-193282. After the outline has been extracted from the image data, the image-processing circuit 17 determines whether the object is a writing implement or not based on a shape of the extracted outline of the object. An example of image recognition technology to determine the shape of the extracted outline is to obtain a center of gravity 40 of the object, to calculate distances between the center of gravity 40 and points on the outline specified by angles based on the center of gravity 40 in order, and then to define the shape of the extracted outline from relations between the distances and the angles. Such a method is disclosed in Japanese Laid-open Patent Application No. 8-315152. Subsequently, data about the shape of the outline obtained by the above-described method is compared with data that has been stored previously as shapes of writing implements in the ROM 24 or in the flash memory 26 at a step S102.
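The centroid-distance signature described above can be sketched roughly as follows. The number of angular bins, the comparison tolerance, and the function names are assumptions for illustration; the referenced patent applications describe the actual methods.

```python
import math

def shape_signature(outline, steps=36):
    """Distances from the centroid to the outline, sampled by angle.

    outline: list of (x, y) points on the extracted outline.
    Returns one distance per angular bin, so two shapes can be
    compared independently of their positions in the image.
    """
    cx = sum(p[0] for p in outline) / len(outline)
    cy = sum(p[1] for p in outline) / len(outline)
    sig = [0.0] * steps
    for x, y in outline:
        ang = math.atan2(y - cy, x - cx) % (2 * math.pi)
        k = int(ang / (2 * math.pi) * steps) % steps
        sig[k] = max(sig[k], math.hypot(x - cx, y - cy))
    return sig

def matches(sig, ref, tol=0.2):
    """Crude comparison against a stored writing-implement signature."""
    return all(abs(a - b) <= tol * max(b, 1e-9) for a, b in zip(sig, ref))
```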
Consequently, the object is defined by the image-processing circuit 17 either as a writing implement or as an object other than writing implements at a step S103. If the object is determined to be an object other than a writing implement, the image-processing circuit 17 returns to the step S101.
Since an angle between the object and the recording surface 38 is not fixed, the data about the object is compared with the data stored in the ROM 24 or the like by rotating a standard line 41 connecting the center of gravity 40 of the object and the outline of the object in a range of certain angles as shown in
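The rotation of the standard line can be modeled as a circular shift of the center-of-gravity distance profile described at the step S102. The sketch below assumes an angular profile of fixed bin count and a mean-absolute-difference match score; both are illustrative choices, not values from the specification.

```python
import math

# Sketch: build the distance-versus-angle profile around the center of
# gravity 40, then compare it against a stored template under every
# rotation (circular shift). Bin count and scoring are assumptions.

def distance_profile(points, bins=36):
    """Distances from the center of gravity to outline points, ordered by angle."""
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    profile = [0.0] * bins
    for x, y in points:
        angle = math.atan2(y - cy, x - cx) % (2 * math.pi)
        b = int(angle / (2 * math.pi) * bins) % bins
        profile[b] = max(profile[b], math.hypot(x - cx, y - cy))
    return profile

def match_with_rotation(profile, template):
    """Smallest mean absolute difference over all rotations of the template."""
    n = len(template)
    best = float("inf")
    for shift in range(n):
        rotated = template[shift:] + template[:shift]
        diff = sum(abs(a - b) for a, b in zip(profile, rotated)) / n
        best = min(best, diff)
    return best
```

A rotated object then yields a near-zero score against its own stored shape, so the comparison succeeds even though the angle to the recording surface 38 is not fixed.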
If the object has been detected as a writing implement at the step S103, the image-processing circuit 17 determines whether the writing implement has contacted the recording surface 38 as shown in
h=h0−h1
It should be noted that the distances “h0” and “h1” can be obtained from the number of pixels counted from the vertical side 39 to the image-formed locations of the optical axis 18 and the contact point A, and a distance between pixels adjacent to each other (a pixel pitch).
Once a value of the distance "h" has been calculated, a value of the angle β1 can be obtained from the equations EQ1 and EQ2 by use of predetermined values of the distance "f" and the angle α at a step S106. The angle β2 can be obtained similarly by taking the above-described steps in the right camera unit 3. Additionally, at a step S107, the coordinates (x, y) of the contact point A on the recording surface can be obtained from the equations EQ3 and EQ4 by use of the values of the angles β1 and β2, and a predetermined value of the distance L. The CPU 23 may execute a calculation using the equations EQ1 through EQ4. Alternatively, the image-processing circuits 17 and 22 may execute a calculation using the equations EQ1 and EQ2. Additionally, the CPU 23 may execute a calculation using the equations EQ3 and EQ4.
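The equations EQ3 and EQ4 are not reproduced in this excerpt. As an illustration, the sketch below uses the standard two-camera triangulation for camera units placed at (0, 0) and (L, 0), with β1 and β2 measured from the line joining them; this is an assumed form, not necessarily the patent's exact equations.

```python
import math

# Sketch: recover the contact point A from the two viewing angles.
# Assumed geometry: left camera unit at (0, 0), right camera unit at
# (L, 0), angles beta1 and beta2 measured from the baseline.

def contact_point(beta1, beta2, L):
    """Coordinates (x, y) of the contact point A (intersection of sight lines)."""
    t1, t2 = math.tan(beta1), math.tan(beta2)
    x = L * t2 / (t1 + t2)  # x where the line y = x*tan(beta1) meets
    y = x * t1              # the line y = (L - x)*tan(beta2)
    return x, y
```

With both angles at 45 degrees and L = 2, the sight lines meet at (1, 1), midway between the units, which matches the symmetric case.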
The CPU 23 creates depiction data, for instance, by connecting each set of coordinates by straight lines based on a series of coordinate data of the contact point A that was obtained while the writing implement 36 was contacting the recording surface 38. Subsequently, the CPU 23 displays the depiction data on the LCD 8 through the LCD display-control unit 28 at a step S108, and stores the series of the coordinate data of the contact point A in the flash memory 26, for instance, as a single file, at a step S109.
A description will now be given of a second embodiment of the present invention with reference to
According to the present invention, objects such as a pencil, a stick and a finger may be used as writing implements if they are recognized as writing implements by the writing input device 1. However, the pencil and the stick that can be writing implements have different shapes. Accordingly, in a third embodiment of the present invention, typical shapes of writing implements that are different from each other are initially registered as data in the writing input device 1 so that a user can select one of the typical shapes. If the user selects one of the typical shapes appropriate for a writing implement that is to be used for recording information on the recording surface 38, an area of an imaging device including pixels that output signals generated by photoelectric conversion is changed depending on a selected shape. Accordingly, a load on the writing input device 1 to create image data is reduced.
Therefore, an area on the CMOS image sensors 16 and 21 including pixels that output signals generated by the photoelectric conversion is changed depending on an icon selected by the user. At a step S303, the CPU 23 checks whether the first icon or the second icon has been selected by the user. If it is determined at the step S303 that the first icon or the second icon has been selected, the image-processing circuits 17 and 22 reduce the number of pixels in the direction perpendicular to the recording surface 38 that output signals generated by the photoelectric conversion at a step S304, followed by proceeding to the step S101 of the first embodiment (
A description will now be given of a fourth embodiment of the present invention with reference to
A description will now be given of a fifth embodiment of the present invention with reference to
In the above-described embodiments, the writing input device 1 recognizes only one object for recording information on the recording surface 38. However, more than one object, for example, a pen and a finger, may be recognized simultaneously as writing implements by an image recognition method of the writing input device 1. A sixth embodiment provides a solution to such a case. Specifically, when a plurality of objects have been recognized simultaneously as writing implements by the writing input device 1, the writing input device 1 defines an object that is the closest to the left camera unit 2 and the right camera unit 3 as a writing implement. In the sixth embodiment, a description will be given of a case that there are two objects recognized simultaneously as writing implements. As shown in
If it is determined by both of the image-processing circuits 17 and 22 at the step S602 that the object consists of a plurality of the writing implements as described above, the image-processing circuits 17 and 22 decide whether the writing implements are contacting the recording surface at a step S603 similarly to the step S104 of the first embodiment. If it is determined at the step S603 that the writing implements are contacting the recording surface, the image-processing circuits 17 and 22 calculate coordinates of the contact point A of the ballpoint pen and the contact point D of the middle finger from the equations EQ1 through EQ4. In
L1=√(x²+y²)
L2=√((Xmax−x)²+y²)
Similarly, the distances L1 and L2 for the contact point D are obtained. Subsequently, at a step S604, the sum of the distances L1 and L2 is calculated for each of the contact points A and D. In the sixth embodiment, since the sum for the contact point A is smaller than that for the contact point D, the image-processing circuits 17 and 22 define the object contacting the paper at the contact point A, that is, the ballpoint pen, as a valid writing implement. At a step S605, in the case that the image-processing circuits 17 and 22 recognize a plurality of writing implements contacting the recording surface, the image-processing circuits 17 and 22 obtain coordinates of a contact point where the sum of the distances L1 and L2 is the smallest, as valid coordinate data. The steps S108 and S109 are executed after the step S605.
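The selection rule of the steps S604 and S605 can be sketched as follows; Xmax (the x-axis distance between the two camera units) is assumed to be known from the device geometry, and the candidate coordinates are illustrative.

```python
import math

# Sketch of steps S604/S605: among simultaneously detected contact
# points, keep the one whose summed distance to the two camera units
# (L1 + L2) is smallest, i.e. the object closest to the camera units.

def select_valid_contact(points, x_max):
    """Return the contact point with the smallest L1 + L2."""
    def summed_distance(p):
        x, y = p
        l1 = math.hypot(x, y)          # distance to the left camera unit
        l2 = math.hypot(x_max - x, y)  # distance to the right camera unit
        return l1 + l2
    return min(points, key=summed_distance)

# Example: a pen tip near the camera units wins over a fingertip
# farther away on the recording surface (coordinates in millimeters).
valid = select_valid_contact([(100, 200), (100, 50)], x_max=215)
```

This mirrors the intent of the sixth embodiment: the pen held closer to the camera units than the supporting fingers is the one whose contact point is recorded.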
If it is determined by both of the image-processing circuits 17 and 22 at the step S602 that the object does not consist of a plurality of writing implements, the image-processing circuits 17 and 22 proceed to a step S606, and check whether a single writing implement has been recognized. If not, the image-processing circuits 17 and 22 proceed to the step S601. If it is determined at the step S606 that a single writing implement has been recognized, the image-processing circuits 17 and 22 obtain coordinates of a contact point of the single writing implement. Subsequently, the steps S108 and S109 of the first embodiment are executed.
If it is determined at the step S603 that a plurality of writing implements are not contacting the recording surface, the image-processing circuits 17 and 22 proceed to the step S608, and check whether a single writing implement is contacting the recording surface. If not, the image-processing circuits 17 and 22 proceed to the step S601. If it is determined at the step S608 that a single writing implement is contacting the recording surface, the image-processing circuits 17 and 22 obtain coordinates of a contact point of the single writing implement. Subsequently, the steps S108 and S109 of the first embodiment are executed.
In
According to the sixth embodiment, in a case that a plurality of writing implements such as a pen and a finger is recognized by camera units, an object that is the closest to the camera units is selected as the only writing implement. For example, while a user is writing information on a recording sheet with a pen as well as holding the recording sheet by his or her hand, written data of the pen is recorded in a recording medium as electric data if the pen is placed closer than fingers to the camera units. Additionally, the writing input device 1 according to the sixth embodiment can prevent a user from inputting undesired information to the writing input device 1 in a case that an object other than writing implements is recognized as a writing implement by mistake.
A seventh embodiment of the present invention enables simple management of data inputted by a writing implement by defining a size of a recording area as a standard paper size.
In an eighth embodiment of the present invention, a user can set a size of the recording area 13 to one of an A4 size with a longer side placed in the vertical direction (an A4 height), the A4 size with a shorter side placed in the vertical direction (an A4 width), a B4 size with a longer side placed in the vertical direction (a B4 height), the B4 size with a shorter side placed in the vertical direction (a B4 width), and the like. Such information is initially stored in the ROM 24. Additionally, in the eighth embodiment, the width of the recording area 13 can be altered as the distance between the left camera unit 2 and the right camera unit 3 changes. The expansion/contraction units 6 and 7 shown in
According to the eighth embodiment, since a desired recording area can be selected from a plurality of recording areas whose shapes and sizes are different from each other, operability of the writing input device 1 increases.
A description will now be given of a ninth embodiment of the present invention with reference to
For instance at the step S901, by use of the operation unit 27 and a guidance displayed on the LCD 8, a user sets one of recording-area setting modes wherein the straight lines PQ, PR and QS are fixed so that a user can specify only one side RS of a rectangle PQSR as shown in
After the recording area has been defined at the step S904, the recording-area setting mode is cancelled, and the information-inputting mode is set at a step S905. Steps S906 through S909 correspond to the steps S701 through S704 respectively.
If it is determined that the confirmation key has not been pressed at the step S903, the image-processing circuits 17 and 22 proceed to the step S902. If the recording area 13 could not be formed at the step S904, the CPU 23 displays a message on the LCD 8 at a step S910 to notify a user about the failure of the formation of the recording area 13, and proceeds to the step S902.
In the above-described ninth embodiment, the recording area 13 is the rectangle PQSR. However, the writing input device 1 can manage information about the recording area 13 even if the shape of the recording area 13 is a shape other than a rectangle.
According to a tenth embodiment, when the writing input device 1 detects that a user has written information outside the recording area 13, the writing input device 1 notifies the user that an invalid writing operation has been executed by sounding an alarm.
In an eleventh embodiment, a frame is provided on the edge of the recording area 13 so that a user can easily notice a range of the recording area 13. Additionally, from an image photographed by the left camera unit 2 and the right camera unit 3, the image-processing circuits 17 and 22 do not detect a contact point of a writing implement on a recorded material such as paper in an area outside the recording area 13 where the contact point is behind the frame, thereby reducing the load on the writing input device 1 to execute image processes.
In a case that the image-processing circuits 17 and 22 extract an outline of an object from an image photographed by the left camera unit 2 and the right camera unit 3, a larger contrast between the object and its surrounding area produces a higher accuracy in extracting the outline of the object. In other words, a larger difference in luminance between the object and its surrounding area produces a higher accuracy in extracting the outline of the object. If a writing implement is specified, a background color of the image photographed by the left camera unit 2 and the right camera unit 3 can be set to a color which is the most appropriate to a color of the writing implement. However, if the color of the writing implement is not specified, the background color of the image should be set to a color by which objects with various colors can be easily extracted from the image. For instance, it is assumed that a finger is used as a writing implement. In such a case, when the finger is photographed under a regular room light, luminance of the finger is closer to luminance of a white subject than to that of a black subject having the lowest luminance. Accordingly, in the eleventh embodiment, a color of the inner surface of the frame provided on the edge of the recording area 13 is set to black, thereby increasing accuracy in recognizing a shape of an unspecified writing implement and detecting coordinates of a contact point of the unspecified writing implement on a recorded material.
In a case that a sheet of paper and the like are not used as the recording area 13, a frame is provided on the edge of the recording area 13 so that a user can easily distinguish the recording area 13. As shown in
A description will now be given of a twelfth embodiment of the present invention. An amount of electric charge stored in each pixel on a CMOS image sensor in a unit time by photoelectric conversion depends on an amount of light irradiated onto an imaging device. In other words, the amount of electric charge stored in each pixel by the photoelectric conversion in a unit time increases as an amount of incident light to the imaging device increases, and thus a frame rate outputted as image signals can be increased. In addition, a user writes information on a recorded material occasionally in a place where a lighting environment is insufficient. Accordingly, in the twelfth embodiment, accuracy in recognizing a shape of a writing implement and detecting a contact point of the writing implement on the recorded material is increased by irradiating light against the recording area 13 located on the recorded material. When the writing input device 1 is powered on, the LED 9, the LED 10, the LED 11, and the LED 12 are supplied with electric current, and turned on. Consequently, the amount of incident light irradiated to the CMOS image sensors 16 and 21 increases. Thus, the amount of electric charge stored in each pixel by the photoelectric conversion in a unit time increases so that a frame rate outputted as image signals can be increased.
Accordingly, accuracy in recognizing a shape of an unspecified writing implement and detecting coordinates of a contact point of the unspecified writing implement on a recording surface can be increased. The twelfth embodiment becomes very effective especially in a case that the writing input device 1 is used in a place where the lighting environment is insufficient. The description has been given of the twelfth embodiment in which each LED is turned on when the writing input device 1 is powered on. Alternatively, each LED in the writing input device 1 may be provided with a switch, and may be turned on when the switch corresponding to the LED is pressed down.
In a thirteenth embodiment, in a case that a resolution of reading coordinates is low, by setting an entire reading area (the recording area 13) as a part of a page of a data area, written data such as letters inputted by use of a writing implement is combined with a page of written data, and is displayed on an image-display unit, or is stored as a file. Additionally, in a case that a small image-display device that can only display a small number of pixels is used as an image-display device for displaying the written data, the thirteenth embodiment enables displaying images on the image-display device by dividing a page of the data area with a large number of pixels, and writing information in the divided data area. In other words, since a page of the data area is divided into a plurality of blocks, wherein one of the blocks is assigned to a recording area, a page of written data can be created by inputting information in the recording area a number of times even if a resolution of reading information written in the recording area is low.
A range of the recording area is limited to an area where a writing implement can be recognized, as shown in
A description will now be given of a method of managing written data. The written data is managed by a page having a fixed size. A recording area can be set to any size so that a size of the page can be set to any value. In the thirteenth embodiment, it is assumed that a data size (the number of pixels) of the page is set to an A4 data size (the number of pixels). The number of pixels included in the page is set to an 864-pixel width by a 1140-pixel length based on ITU-T Recommendation T.4, which relates to group-3 facsimile documents. The above-described size of the page is equivalent to 100 dpi (dots per inch) in the width and the length of the page. A page of written data or a plurality of pages of written data may be stored as one file in the flash memory 26.
The description has been given of the method of obtaining coordinates of contact points or writing positions of a writing implement by using the equations EQ1 through EQ4 when units are placed in positions shown in
In the above-described method, a resolution of reading information written at the writing position depends on a distance measured from the writing position to the left camera unit 2 or to the right camera unit 3. A concrete description will be given of a relation between the distance and the resolution with reference to
λ1=d1*tanθ EQ5
λ2=d2*tanθ EQ6
It is obvious from the above equations EQ5 and EQ6 that distance ranges λ1 and λ2 of a subject photographed by pixels E and F (pixels where the points E and F are located), which are adjacent to each other on the surface of the CMOS image sensor, vary depending on the distances d1 and d2 from the wide-angle lens. The pixels E and F photograph the subject located at the distance d1 from the point G in the range λ1. On the other hand, the pixels E and F photograph the subject located at the distance d2 from the point G in the range λ2. This indicates that accuracies in reading coordinates of the writing position are different in a case of writing information from the point A to the point B and in a case of writing information from the point C to the point D.
A resolution measured in dpi of reading the coordinates of the writing position on the line AB is obtained by dividing one inch that is a unit length by the distance λ1 (inch). In addition, a resolution of reading the coordinates of the writing position on the line CD is obtained by dividing one inch by the distance λ2 (inch). In a case that the number of pixels that can be read by one of the left camera unit 2 and the right camera unit 3 in a horizontal direction of the CMOS image sensors 16 and 21, that is, a direction parallel to the recording surface 38 in a photographed image shown in
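The dependence of the reading resolution on the distance from the camera unit can be sketched numerically. The 90-degree view angle spread over the 640 readable pixels is an illustrative assumption; the specification gives only the pixel count.

```python
import math

# Sketch of EQ5/EQ6 and the dpi computation above: the range lambda
# covered by one pixel grows with the distance d from the lens, so the
# resolution of reading coordinates falls with distance. The per-pixel
# angle (90 degrees over 640 pixels) is an illustrative assumption.

MM_PER_INCH = 25.4

def reading_resolution_dpi(d_mm, view_angle_deg=90.0, pixels=640):
    theta = math.radians(view_angle_deg / pixels)  # angle covered by one pixel
    lam_mm = d_mm * math.tan(theta)                # EQ5/EQ6: range of one pixel
    return MM_PER_INCH / lam_mm                    # one inch divided by lambda
```

Because λ is proportional to d, tripling the distance from the lens cuts the reading resolution to one third, which is why the later embodiments interpolate written data in the far part of the recording area.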
Now, it is assumed that the LCD 8 can display 432 pixels in the horizontal direction and 285 pixels in the vertical direction. In a case of displaying a page of written data whose size is the 864-pixel width by the 1140-pixel length on the LCD 8, the page must be divided into several blocks.
A description will now be given of processes performed in the thirteenth embodiment in a case of writing information in one of the writing blocks 52 created as described above.
x_dot=x/(an interval of recording coordinates in the x-axis direction) EQ7
y_dot=y/(an interval of recording coordinates in the y-axis direction) EQ8
Subsequently, by setting the top left corner of a page of written data as the origin of the page, an origin of an “n”th writing block 52 can be expressed as coordinates (x_org(n), y_org(n)). A range of the value “n” is from 0 to 15. At a step S1109, coordinates (x_dot_page, y_dot_page) of the writing position in a page of written data are obtained from the following equations.
x_dot_page=x_org(n)+x_dot EQ9
y_dot_page=y_org(n)+y_dot EQ10
The writing input device 1 manages the coordinates (x_dot_page, y_dot_page) expressed in pixels as written data. For instance, in a case that information is written at coordinates (100, 200) in a tenth writing block, coordinates (100, 200) in the tenth writing block are converted into coordinates (316, 770) in a page. It should be noted that coordinates of the origin of the tenth writing block are (216, 570).
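The conversion of the equations EQ9 and EQ10 can be sketched as follows. The 4-by-4 arrangement of the sixteen 216-by-285-pixel writing blocks in the 864-by-1140-pixel page is an assumption, but it is consistent with the origin (216, 570) given above for the tenth writing block (n = 9).

```python
# Sketch of EQ9/EQ10: converting coordinates inside the n-th writing
# block 52 into page coordinates. The 4x4 block layout is an assumption
# consistent with the origin (216, 570) of the tenth block (n = 9).

BLOCK_W, BLOCK_H = 864 // 4, 1140 // 4  # 216 x 285 pixels per block

def block_origin(n):
    """Origin (x_org(n), y_org(n)) of writing block n, with 0 <= n <= 15."""
    return (n % 4) * BLOCK_W, (n // 4) * BLOCK_H

def to_page(n, x_dot, y_dot):
    """EQ9/EQ10: page coordinates of a point written in block n."""
    x_org, y_org = block_origin(n)
    return x_org + x_dot, y_org + y_dot
```

With this layout, writing at (100, 200) in the tenth block (n = 9) lands at (316, 770) in the page, matching the worked example in the text.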
At a step S1110, the CPU 23 stores a series of coordinate data of a contact point that has been obtained while the writing implement was contacting the recording surface in one of the writing blocks in a memory as written data of the writing block in a page of written data. In addition, the CPU 23 creates depiction data from the series of the coordinate data by use of a method of connecting each set of coordinates with a straight line, for example, and displays the depiction data on the LCD 8 through the LCD display-control unit 28 at a step S1111.
It is assumed in the thirteenth embodiment that the resolution of reading coordinates is even throughout the entire recording area when obtaining the coordinates (x_dot, y_dot) expressed in pixels in order to simplify the description. However, in reality, the resolution of reading the coordinates varies depending on a distance from the left camera unit 2 or the right camera unit 3 to the coordinates (x_dot, y_dot). A method of making a pixel density of written data even throughout the entire recording area will be described later in other embodiments.
Additionally, the following equations EQ11 and EQ12 can be substituted for the equations EQ7 and EQ8 for obtaining the coordinates (x_dot, y_dot) expressed in pixels from the coordinates (x, y).
x_dot=864×(x/215)×(1/4) EQ11
y_dot=1140×(y/290)×(1/4) EQ12
The equations EQ11 and EQ12 are derived from the following facts. The numbers of pixels in the width and the length of a page of written data are respectively 864 pixels and 1140 pixels. The width and the length of the page are respectively 215 millimeters and 290 millimeters. In addition, the width and the length of one of the writing blocks corresponding to the recording area are ¼ of the width and the length of the page respectively.
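A direct sketch of EQ11 and EQ12, converting millimeter coordinates read in the 215 mm by 290 mm recording area into pixel coordinates inside one writing block (one quarter of the page in each dimension):

```python
# Sketch of EQ11/EQ12: the 864 x 1140 pixel page spans 215 mm x 290 mm,
# and one writing block is 1/4 of the page in width and in length, so a
# point (x, y) in millimeters maps into block pixels as follows.

def to_block_pixels(x_mm, y_mm):
    x_dot = 864 * (x_mm / 215.0) * (1 / 4)   # EQ11
    y_dot = 1140 * (y_mm / 290.0) * (1 / 4)  # EQ12
    return x_dot, y_dot
```

The far corner of the recording area, (215, 290), maps to (216, 285), which is exactly the pixel size of one writing block, confirming the scale factors.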
In a fourteenth embodiment of the present invention, the recording area is assigned to any area in a page of a data area. Additionally, in a case that the resolution of reading coordinates in the recording area is low, a page of written data is created by writing data in the recording area several times. In other words, in the above-described thirteenth embodiment, the description has been given of the method of dividing the number of pixels in a page of the data area into 8 blocks, displaying one of the blocks on the LCD 8, and assigning half of a block to the recording area. On the other hand, in the fourteenth embodiment, a description will be given of a method of assigning a pixel-displaying area of the LCD 8 to any area in a page of the data area, and then assigning a part of the pixel-displaying area to the recording area. Similarly to the thirteenth embodiment, the LCD 8 can display 432 pixels in the horizontal direction and 285 pixels in the vertical direction. A location of the pixel-displaying area of the LCD 8 can be moved freely to any location in a page of the data area by a user operation. A method of moving the pixel-displaying area of the LCD 8 will be described later in other embodiments.
x_dot_page=x_org+x_dot EQ13
y_dot_page=y_org+y_dot EQ14
For example, if data is written at coordinates (100, 200) in the displaying area 54 for writing shown in
As described above, the fourteenth embodiment is characterized by the function to assign the pixel-displaying area 53 to any location in a page of the data area. After a location of the pixel-displaying area 53 has been set in a page of the data area, the steps S1110 and S1111 shown
In a fifteenth embodiment, an area corresponding to a recording area is displayed on a displaying unit distinctively so that a user can recognize easily which part of a page of a data area he or she is currently writing data in, thereby improving operability of the writing input device 1. In the above-described thirteenth embodiment, in a case that a writing block corresponding to the recording area is displayed on a monitor of the LCD 8, the writing block is displayed in the right or left half of a displaying block as shown in
In a sixteenth embodiment, a description will be given of a method of displaying an area corresponding to a recording area on a displaying unit distinctively so that a user can recognize easily which part of a page of a data area he or she is currently writing data in, thereby improving operability of the writing input device 1. In the thirteenth embodiment, the size of a writing block is set to exactly 1/16 of the size of a page of the data area as shown in
A description will now be given of a seventeenth embodiment of the present invention. In the seventeenth embodiment, writing data in any part of a page of a data area is enabled since any of writing blocks corresponding to a recording area can be selected, thereby improving operability of the writing input device 1.
According to the seventeenth embodiment, writing data in any part of a page of the data area is enabled since any of the writing blocks corresponding to the recording area can be selected, thereby improving operability of the writing input device 1.
An eighteenth embodiment enables free movement of an area corresponding to the recording area in a page of a data area, and thus simplifies writing data in any part of a page of the data area, thereby improving operability of the writing input device 1. The direction-specifying key is used for selecting a writing block in the seventeenth embodiment. On the other hand, the direction-specifying key is used for moving the pixel-displaying area 53 and the displaying area 54 for writing by a pixel in the eighteenth embodiment. The pixel-displaying area 53 of the LCD 8 and the displaying area 54 for writing that corresponds to the recording area in the pixel-displaying area 53 are shown in
A description will now be given of a case of moving the displaying area 61 for writing in the pixel-displaying area 60. It should be noted that the pixel-displaying area 53 corresponds to the pixel-displaying area 60. In addition, the displaying area 54 for writing corresponds to the displaying area 61 for writing. The writing input device 1 initially checks whether the moving-area selecting key has been pressed at the step S1401 of
According to the eighteenth embodiment, an area corresponding to the recording area can be moved freely in a page of the data area, thereby improving operability of the writing input device 1.
A description will now be given of a nineteenth embodiment of the present invention. In a case of using a lens for the left camera unit 2 and the right camera unit 3, a larger distance from one of the left camera unit 2 and the right camera unit 3 produces a lower resolution of reading coordinates. Thus, the nineteenth embodiment provides a method of preventing deterioration of an image caused by lack of written data by executing an interpolation process against the written data in an area where the resolution of reading coordinates is low. Additionally, the nineteenth embodiment provides a method of making pixel density and image quality of written data obtained by a writing implement moving a certain distance substantially even throughout the entire recording area, by executing an interpolation process or a decimation process properly based on a resolution of reading coordinates at a writing position of the writing implement.
In the nineteenth embodiment, the writing input device 1 manages a page of written data in the number of pixels corresponding to 100 dpi, that is, the 864-pixel width by the 1140-pixel length. Since the size of the recording area corresponds to 1/16 of the size of a page of the written data as shown in
In a case that the number of pixels read by a CMOS image sensor provided in each camera unit in the horizontal direction (a direction parallel to a recording surface of an image photographed by each camera unit) is 640 pixels, the resolution of reading coordinates at the points R and S in the A4-sized recording area shown in
A description will now be given of a method of calculating a resolution of reading coordinates in the recording area.
A point T1 is provided in the recording area so that image-formed locations of the points T and T1 are at pixels adjacent to each other on the CMOS image sensor 16 of the left camera unit 2. Similarly, a point T2 is provided in the recording area so that image-formed locations of the points T and T2 are at pixels adjacent to each other on the CMOS image sensor 21 of the right camera unit 3. An angle θ corresponds to a photographing range of each pixel on the CMOS image sensors 16 and 21, and depends on a view angle of each of the left camera unit 2 and the right camera unit 3. Lengths of lines PT, QT, TT1 and TT2 are named k1, k2, L1 and L2 respectively. An angle formed by lines PT1 and TT1, and an angle formed by lines QT2 and TT2 are set to 90 degrees. In addition, coordinates of the point T are set to (x, y). Accordingly, values of the lengths k1 and k2 are obtained as follows.
k1=√(x²+y²) EQ15
k2=√((215−x)²+y²) EQ16
Subsequently, values of lengths L1 and L2 can be obtained from the following equations.
L1=k1*sinθ EQ17
L2=k2*sinθ EQ18
The coordinates (x, y) of the point T obtained from the equations EQ1 through EQ4 are expressed in millimeters so that the lengths L1 and L2 obtained from the equations EQ17 and EQ18 are expressed also in millimeters. Thus, a unit of the lengths L1 and L2 should be converted from a millimeter to an inch by use of the fact that one millimeter is equal to about 0.03937 inch. Subsequently, a resolution Rdiv_L of reading coordinates by the left camera unit 2 and a resolution Rdiv_R of reading coordinates by the right camera unit 3 at the point T are obtained respectively by dividing one inch by the length L1 expressed in inches and by the length L2 expressed in inches.
Rdiv_L=1/(L1×0.03937) EQ19
Rdiv_R=1/(L2×0.03937) EQ20
Since the point T is provided in the recording area so that the value of k1 is greater than the value of k2 as shown in
Rdiv_R>Rdiv_L EQ21
As seen in the equation EQ21, the resolution of reading coordinates by the right camera unit 3 is higher than the resolution of reading coordinates by the left camera unit 2.
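The chain from EQ15 through EQ21 can be sketched end to end. The per-pixel angle θ is an illustrative assumption (it depends on the view angle of the camera units, which this excerpt does not give numerically).

```python
import math

# Sketch of EQ15-EQ20: reading resolutions of the two camera units at a
# point T = (x, y) in the 215 mm-wide recording area, with camera units
# at P = (0, 0) and Q = (215, 0). theta, the angle covered by one pixel,
# is an illustrative assumption.

def reading_resolutions(x, y, theta):
    k1 = math.hypot(x, y)        # EQ15: length PT (left camera unit)
    k2 = math.hypot(215 - x, y)  # EQ16: length QT (right camera unit)
    l1 = k1 * math.sin(theta)    # EQ17: length TT1 in millimeters
    l2 = k2 * math.sin(theta)    # EQ18: length TT2 in millimeters
    rdiv_l = 1 / (l1 * 0.03937)  # EQ19: left-unit resolution in dpi
    rdiv_r = 1 / (l2 * 0.03937)  # EQ20: right-unit resolution in dpi
    return rdiv_l, rdiv_r
```

For a point nearer the right camera unit (k1 greater than k2), the sketch reproduces EQ21: Rdiv_R exceeds Rdiv_L.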
As described above, the resolution of reading coordinates by the left camera unit 2 is different from the resolution of reading coordinates by the right camera unit 3 at any point except a point where the values of k1 and k2 are identical. Accordingly, the writing input device 1 calculates a vector between each coordinate data belonging to a series of the coordinate data, and decides, based on a direction of the vector between each set of coordinates, whether to use the resolution of reading coordinates by the left camera unit 2 or that by the right camera unit 3 for executing the decimation process on written data. For example, the resolution of reading coordinates by the left camera unit 2 is used when a user is writing data in a top-right direction as shown in
A description will now be given of a method of executing an interpolation process on written data. The above-described processes shown in
Accordingly, the writing input device 1 generates written data at 50 dpi by executing the interpolation process on written data, that is, a series of coordinates read by the CMOS image sensors 16 and 21, in a case that the resolution of reading coordinates is lower than 50 dpi. A spline-curve method and a Bezier-curve method are used in the interpolation process executed on the written data. The spline-curve method is a method of interpolating coordinates at a fixed interval on a curve after obtaining the curve that includes all the coordinates obtained by the CMOS image sensors 16 and 21 thereon. The Bezier-curve method is a method of interpolating coordinates at a fixed interval on a curve after obtaining the curve that includes first and last coordinates provided in a series of coordinates of a writing position thereon by using coordinates located between the first and last coordinates only for deciding a shape of the curve. In other words, the first and last coordinates are on a Bezier curve. However, other coordinates located between the first and last coordinates are not necessarily on the Bezier curve. As described above, the writing input device 1 executes the interpolation process on written data in a case that the resolution of reading coordinates at the writing position is less than 50 dpi. Alternatively, the writing input device 1 executes the decimation process on written data in a case that the resolution of reading coordinates is higher than 50 dpi. Subsequently, the writing input device 1 adds the written data, whose pixel density is thus 50 dpi, to a recording area corresponding to the written data in a page of written data.
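The spline-curve interpolation can be sketched as below. A Catmull-Rom spline is used here as one concrete spline that passes through every read coordinate, as the specification requires of the spline-curve method; the specification does not name a particular spline, and the per-segment step count stands in for the fixed sampling interval.

```python
# Sketch of the interpolation process: fit a curve through all the read
# coordinates and insert interpolated coordinates between them. A
# Catmull-Rom spline is an assumed concrete choice of such a curve.

def catmull_rom(p0, p1, p2, p3, t):
    """Point at parameter t in [0, 1] on the spline segment from p1 to p2."""
    def axis(a, b, c, d):
        return 0.5 * ((2 * b) + (-a + c) * t
                      + (2 * a - 5 * b + 4 * c - d) * t * t
                      + (-a + 3 * b - 3 * c + d) * t ** 3)
    return (axis(p0[0], p1[0], p2[0], p3[0]),
            axis(p0[1], p1[1], p2[1], p3[1]))

def interpolate(points, steps_per_segment):
    """Insert interpolated coordinates between each pair of read coordinates."""
    pts = [points[0]] + points + [points[-1]]  # duplicate endpoints as tangents
    out = []
    for i in range(1, len(pts) - 2):
        for s in range(steps_per_segment):
            out.append(catmull_rom(pts[i - 1], pts[i], pts[i + 1], pts[i + 2],
                                   s / steps_per_segment))
    out.append(points[-1])
    return out
```

Every original coordinate stays on the interpolated stroke, and the density of coordinates between them rises with the step count, which is how sparse far-field written data is brought up to the target pixel density.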
A description will now be given of a twentieth embodiment of the present invention. The twentieth embodiment provides a method of implementing the present invention by use of software.
As described above, the present invention enables a writing input device to use any desired writing implement and any desired recorded material whose recording surface is a plane. Additionally, according to the present invention, the writing input device can record data in a recording medium in real time while the data is being written on the recording surface with a writing implement. Additionally, the writing input device can create and manage a page of written data by dividing the page into a plurality of areas and assigning each of the areas to a recording area, even when the resolution of reading coordinates in a recording area is low. Additionally, the writing input device can keep the pixel density of the written data even throughout the entire recording area. Furthermore, the writing input device can control display of the recording area on an LCD to improve the operability of writing data in the recording area.
The above description is provided in order to enable any person skilled in the art to make and use the invention and sets forth the best mode contemplated by the inventors of carrying out the invention.
The present invention is not limited to the specially disclosed embodiments and variations, and modifications may be made without departing from the scope and spirit of the invention.
The present application is based on Japanese Priority Application No. 11-369699, filed on Dec. 27, 1999, the entire contents of which are hereby incorporated by reference.