In one embodiment, a method of tracking an object carrying a wireless location device comprises recording and storing images from a plurality of cameras corresponding to respective coverage areas having predetermined locations, determining location information associated with the wireless location device, the location information corresponding to one or more of said coverage areas, and determining which of the images correspond to the location information, and retrieving said images.
1. A method comprising:
recording and storing still images on an image server from a plurality of still cameras corresponding to respective coverage areas having predetermined locations, wherein the still cameras record the still images at periodic intervals, wherein recording and storing the still images is automatically initiated in response to a predetermined event such that no recording and storing of the still images occurs prior to the predetermined event;
determining, storing and timestamping location information associated with a wireless location device carried by an object, the location information corresponding to one or more of the coverage areas, wherein the location information is determined based at least upon a determined distance of the wireless location device from one or more base stations;
in response to a surveillance request comprising a surveillance time window associated with the object that includes a time period prior to a current time period, determining which timestamped location information associated with the wireless location device carried by the object is within the surveillance time window;
determining which of the still images stored on the image server correspond to the location information associated with the surveillance time window; and
retrieving still images from the image server that are associated with the surveillance time window and that depend upon the location information associated with the object in order to track the object.
15. A system comprising:
a plurality of still cameras arranged to record still images from respective coverage areas having predetermined locations, wherein the still cameras record the still images at periodic intervals;
an image server coupled to the still cameras and arranged to store the still images recorded by the still cameras, wherein the image server is further arranged to automatically initiate recording of the still images in response to a predetermined event such that no recording of the still images occurs prior to the predetermined event;
a plurality of wireless base stations having respective predetermined locations and operable to communicate wirelessly with a wireless location device carried by an object;
a location server arranged to determine, store and timestamp location information associated with a wireless location device carried by the object, the location information corresponding to one or more of the coverage areas, wherein the location information is determined based at least upon a determined distance of the wireless location device from one or more base stations; and
a tracking server arranged to receive a surveillance request comprising a surveillance time window associated with the object that includes a time period prior to a current time period, to determine which of the still images from the image server correspond to the location information based upon timestamps associated with the location information, and to retrieve the still images from the image server that are associated with the surveillance time window and depend upon the location information associated with the object and with timestamps within the surveillance window in order to track the object.
2. The method of
wherein the still images are stored together with respective timestamps that correspond with timestamps of the location information.
3. The method of
wherein determining which of the still images correspond to the location information comprises matching the predetermined locations of the coverage areas with the location information associated with the wireless location device and respective timestamps associated with the still images and location information.
4. The method of
wherein determining the location information comprises receiving wireless transmissions from the wireless location device.
5. The method of
wherein the one or more base stations have predetermined locations and receive wireless transmissions from the wireless location device.
6. The method of
wherein the location information comprises one or more of estimated position of the wireless location device, coverage area identifier, signal strength of wireless transmissions received from the wireless location device, wireless location device identifier, base station identifier, and GPS coordinates of the wireless location device.
7. The method of
8. The method of
the predetermined event being the determination that the object is located within one of the coverage areas.
9. The method of
determining second location information associated with a second location device carried by a second object, the second location information corresponding to one or more of the coverage areas; and
determining which of the still images correspond to the second location information, and retrieving the still images.
10. The method of
wherein the determined distance of the wireless location device from one or more base stations is based upon a signal strength of a remote device in relation to each base station.
11. The method of
obtaining location information at the wireless location device based upon signals received by the wireless location device from the one or more base stations.
12. The method of
wherein the recording and storing still images on the image server occurs independently in relation to the surveillance request, the determining which still images stored on the image server correspond to the location information associated with the surveillance time window and the retrieving of still images from the image server that are associated with the surveillance time window, such that at least some recorded and stored still images on the image server are not associated with the surveillance request associated with the object and are thus not retrieved from the image server.
13. The method of
wherein the object is lost, and the surveillance request comprises a request to find the lost object.
16. The system of
wherein the location information comprises one or more of estimated position of the wireless location device, coverage area identifier, signal strength of wireless transmissions received from the wireless location device, wireless location device identifier, base station identifier, and GPS coordinates of the wireless location device.
17. The system of
wherein the still images stored on the image server are each associated with a respective timestamp, and the timestamps associated with the still images correspond with timestamps associated with the location information.
18. The system of
wherein the tracking server is further arranged to trigger storing of the still images by the image server in response to the predetermined event.
19. The system of
the location server is further arranged to determine and store second location information associated with a second location device carried by a second object, the second location information corresponding to one or more of the coverage areas; and
the tracking server is further arranged to determine which of the still images correspond to the second location information, and to retrieve the still images.
20. The system of
wherein the location server is further arranged to determine the distance of the wireless location device from one or more base stations based upon a signal strength of a remote device in relation to each base station.
21. The system of
wherein the base stations provide signals to the wireless location device, the wireless location device determines location information based upon the signals received from the base stations, and the location server receives location information from the wireless location device.
The present disclosure relates generally to tracking and surveillance of an object.
Fixed video and still cameras may be used for monitoring an object such as a person within a given coverage or monitored area. Examples include security cameras mounted in and about buildings which are monitored by security guards. Other cameras may incorporate controllable movement so that a security guard may track an object, for example to follow a person of interest moving about a building. Various surveillance systems are available to detect an object, for example by detecting movement or scene changes. The object may then be tracked or otherwise monitored using fixed and/or moveable cameras.
In particular embodiments a method of tracking an object carrying a wireless location device is provided. The method comprises recording and storing images from a plurality of cameras corresponding to respective coverage areas having predetermined locations, and determining location information associated with the wireless location device, the location information corresponding to one or more of the coverage areas. The method further comprises determining which of the images correspond to the location information, and retrieving these images.
In particular embodiments a system for tracking an object carrying a wireless location device is provided. The system comprises a plurality of cameras arranged to record images from respective coverage areas having predetermined locations, an image server coupled to the cameras and arranged to store the images recorded by the cameras, a location server arranged to determine and store location information associated with the wireless location device, the location information corresponding to one or more of said coverage areas, and a tracking server arranged to determine which of the images from the image server correspond to the location information, and to retrieve these images.
In particular embodiments a method of surveillance of an object is provided. The method comprises receiving a surveillance request comprising a surveillance time window, determining location information associated with the object over this surveillance time window, the location information corresponding to two or more coverage areas having predetermined locations, retrieving images of the two or more coverage areas which correspond to the location information over the surveillance time window, and displaying the images retrieved.
In particular embodiments a tracking server for tracking an object carrying a wireless location device is provided. The tracking server comprises a processor arranged to determine location information associated with the object over a surveillance time window in response to receiving a surveillance request comprising said surveillance time window. The location information corresponds to two or more coverage areas having predetermined locations. The processor is further arranged to retrieve images of the two or more coverage areas which correspond to the location information over the surveillance time window.
Embodiments are described with reference to the following drawings, by way of example only and without intending to be limiting, in which:
Referring to
The system 100 is used to track one or more objects 105a, 105b through the surveillance region. The objects may include a person (105a), a notebook computer (105b), an artwork, or any moveable object. Each object carries a respective wireless location device 110a, 110b, for example in a pocket of a person (105a) or integrated within a notebook computer (105b). The wireless location devices 110a, 110b may be radio frequency identification (RFID) tags, or any wireless device such as a mobile phone which can be configured to communicate with the system in order to enable location information associated with the device to be determined.
The cameras 115x-115z periodically record images of their respective coverage areas 120x-120z which may or may not include an object 105a, 105b, and forward these recorded images together with a respective camera identifier (CamID) to the image server 150. For example each recorded image may be sent as an image file and associated camera identifier (170x, 170y, 170z). The cameras 115x-115z and image server 150 may be coupled using a local area network, coaxial cable or any other suitable mechanism as would be appreciated by those skilled in the art. The image server 150 timestamps the received image file and camera identifier (170x, 170y, 170z) using a suitable timestamp such as a time from a common clock (180) also used by the location server 155 or an internal clock sufficiently synchronized with a corresponding internal clock within the location server 155. The time-stamped image files and camera identifiers are then stored on the image server 150.
The base stations 125a-125c periodically determine location information associated with the wireless location devices 110a, 110b, for example by identifying nearby wireless location devices 110a, 110b and measuring the signal strength of signals received from these identified wireless location devices. The wireless location devices 110a, 110b are configured to periodically transmit their own unique device identifier (WLD_ID). The signal strength of this signal from the wireless location devices can then be measured by receiving base stations 125a-125c as will be appreciated by those skilled in the art. This signal strength measurement can then be used as a proxy for range or distance between the wireless location device 110a, 110b and the respective base station 125a-125c. If the wireless location device signal is picked up by a number of base stations 125a-125c, then the relative measured signal strengths from each base station can be used to determine the relative position of the wireless location devices 110a, 110b using triangulation as will also be appreciated by those skilled in the art. By knowing the locations of the base stations 125a-125c, the positions of the wireless location devices 110a, 110b can then be estimated. Various system configurations will be available to those skilled in the art in order to coordinate the activities of the base stations 125a-125c and wireless location devices 110a, 110b, for example in order to ensure that the base stations are listening for the wireless location device signal transmissions at the right time. This may be achieved for example by arranging the base stations to periodically transmit a common beacon signal to which each of the wireless location devices 110a, 110b is configured to respond.
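The signal-strength-to-distance conversion described above can be sketched with a log-distance path-loss model. This is one common choice rather than something the embodiment prescribes, and the reference power and path-loss exponent below are illustrative assumptions that would need per-environment calibration.

```python
def rssi_to_distance(rssi_dbm, tx_power_dbm=-40.0, path_loss_exponent=2.0):
    """Estimate range (in metres) from a received signal strength
    measurement using a log-distance path-loss model.

    tx_power_dbm is the expected RSSI at 1 m from the device; both it and
    path_loss_exponent are environment-dependent assumptions, not values
    taken from the embodiment."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))
```

With these illustrative constants, a reading equal to the 1 m reference power maps to 1 m, and each additional 20 dB of attenuation multiplies the estimated range by ten.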
The base stations 125a-125c are typically located in and around the coverage areas 120x-120z so that each coverage area may be “observed” by at least three base stations 125a-125c. In other words, if an object (105a) and hence a respective wireless location device (110a) are located in a coverage area (120x), then at least three base stations (125a, 125b, 125c) would normally receive and be able to measure the signal strength of signals from that wireless location device (110a).
The base stations 125a-125c forward the wireless location device (110a, 110b) identifiers (WLD_ID) and their respective signal strength measures to the location server 155, together with their respective base station identifiers (BS_ID). This location information 175a-175c is received by the location server 155 and corresponds to one or more of the coverage areas 120x-120z. In other words, because the locations of the base stations 125a-125c are known and positioned around the coverage areas 120x-120z, the positions of the wireless location devices 110a, 110b can be estimated and "located" within or near one of the coverage areas 120x-120z. The location information 175a-175c can therefore include the wireless location device (110a, 110b) identifiers (WLD_ID), their respective signal strengths, and the base station identifier (BS_ID) of the base station 125a-125c that received the signal from the wireless location device 110a, 110b. Further location information may include the locations of the respective base stations 125a-125c, received signal angle-of-arrival information, received signal time-of-arrival information, and Global Positioning System (GPS) coordinates from the wireless location devices 110a, 110b.
The base stations 125a-125c may be coupled to the location server 155 by a local area network (LAN) or any other suitable mechanism. A common LAN (not shown) may be used for coupling the base stations 125a-125c and location server 155, as well as the cameras 115x-115z and image server 150.
The location information received by the location server 155 may simply be time-stamped and stored, for example using the time-stamp functionality 180 used by the image server 150. Alternatively, the location server may further process this location information in order to determine further location information; for example by estimating a position for each wireless location device 110a, 110b. This position estimating may be implemented using the known locations of the base stations 125a-125c which received a signal from the respective wireless location devices 110a, 110b, together with the respective signal strengths of these signals. For example, taking the first object 105a in
The system 100 of this embodiment therefore provides time-stamped images of each coverage area 120x-120z as well as time-stamped location information for each object 105a, 105b, this location information corresponding to one or more of the coverage areas. This allows the tracking server 160 to track a selected object 105a through the coverage areas over time, and hence to retrieve images of that object. Thus given a surveillance time window, the tracking server 160 can determine from the location server the location information of the selected object 105a over that surveillance time window. This location information may simply comprise the coverage area 120x-120z in which the wireless location device 110a carried by the selected object 105a was located at each of a number of time intervals within the surveillance time window. Alternatively, this coverage area information may be determined from other location information stored within the location server 155, for example wireless location device 110a identifiers, corresponding signal strengths and associated base station locations. Once the coverage areas 120x-120z and the respective time intervals during which the wireless location device 110a was located in each coverage area are determined, images corresponding to those coverage areas 120x-120z at those time intervals can be requested from the image server 150. The sequence of coverage areas over the surveillance time window can then be displayed on the screen 165 in order to track the object 105a.
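The tracking operation just described — filtering one device's timestamped location records to a surveillance time window, then collapsing the resulting coverage-area sequence into per-visit image requests — can be sketched as follows. The record layout and function names are hypothetical; the embodiment does not prescribe a particular data model.

```python
def track_over_window(location_records, wld_id, window_start, window_end):
    """Return the (timestamp, coverage_area) sequence for one wireless
    location device over a surveillance time window, in chronological
    order. location_records is an iterable of dicts with 'wld_id', 'ts'
    and 'area' keys -- a simplified stand-in for the location server's
    stored, timestamped location information."""
    hits = [
        (rec["ts"], rec["area"])
        for rec in location_records
        if rec["wld_id"] == wld_id and window_start <= rec["ts"] <= window_end
    ]
    return sorted(hits)


def image_requests(track):
    """Collapse consecutive samples in the same coverage area into one
    (area, first_ts, last_ts) request per visit, ready to be sent to the
    image server."""
    requests = []
    for ts, area in track:
        if requests and requests[-1][0] == area:
            requests[-1] = (area, requests[-1][1], ts)
        else:
            requests.append((area, ts, ts))
    return requests
```

A device seen in area 120x at two successive intervals and then in 120y would yield two requests, one per coverage-area visit, each spanning the time the device was there.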
The system of this embodiment may be used for many applications, for example tracking a lost child in an amusement park or other crowded public area or tracking a notebook computer which has been removed from its last known position. More generally, embodiments may be used for security surveillance, inventory tracking in enterprises, and any application that requires video surveillance.
In alternative embodiments, the wireless location devices 110a, 110b may be arranged to simply forward their estimated coordinates to the location server 155, without the need for signal strength measuring at base stations having known locations. For example the wireless location devices 110a, 110b may incorporate GPS functionality and periodically forward their respective GPS coordinates to the location server 155 using a cellular network, or using WLAN base stations whose location is not required. In another example the wireless location devices 110a, 110b may estimate their locations using signals received from base stations having known locations, and forward this location information to the location server 155. In yet a further example, a base station may be positioned within each coverage areas 120x-120z such that when a wireless location device hands-off from one base station to another, it can be determined that the wireless location device has also moved from one coverage area to another—the locations of the base stations or their correspondence with the coverage areas being known.
In further alternative embodiments, the image server 150, location server 155, tracking server 160, screen 165, and time-stamp function 180 may be implemented in a single computer system, or distributed in any suitable manner as would be appreciated by those skilled in the art. Furthermore, the functionality implemented in the image server 150, location server 155, and tracking server 160 may be combined or distributed differently in other apparatus.
Referring now to
The image server (150) receives the recorded images and camera identifiers from a plurality of cameras (115x-115z) at step 220. Thus the image server 150 receives images of a plurality of fixed or known location coverage areas (120x-120z) over time. The image server (150) then timestamps these image files (and camera identifiers) at step 225. This step may be implemented using timestamp signals received from a time-stamping function (180) also used by the location server (155); however, the time-stamping function does not require a high degree of precision given the speed of the objects (105a, 105b), typically people or objects carried by people, moving about within the coverage areas (120x-120z). The image server then stores the time-stamped image files and camera identifiers at step 230. Given the large size of image files, reduced resolution images or reduced frequency of recorded images may be used in order to reduce the storage requirements in some implementations. Similarly, images may only be stored when a wireless location device (110a, 110b) has been determined to be within the coverage area as will be described in more detail below.
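The image server's receive, timestamp, store and query behaviour (steps 220-230) can be sketched as a small in-memory class. The `clock` parameter stands in for the common time-stamping function (180); the class and its method names are illustrative assumptions, not part of the embodiment.

```python
import time


class ImageServer:
    """Minimal in-memory sketch of the image server: it timestamps each
    (camera_id, image) pair on receipt and answers time-interval queries
    later made by the tracking server."""

    def __init__(self, clock=time.time):
        self._clock = clock   # stands in for the common clock / function 180
        self._store = {}      # camera_id -> list of (timestamp, image)

    def receive(self, camera_id, image):
        """Steps 220-230: timestamp and store an incoming image file."""
        ts = self._clock()
        self._store.setdefault(camera_id, []).append((ts, image))
        return ts

    def query(self, camera_id, t_start, t_end):
        """Return stored images for one camera within [t_start, t_end]."""
        frames = self._store.get(camera_id, [])
        return [img for ts, img in frames if t_start <= ts <= t_end]
```

Injecting a deterministic clock (rather than `time.time`) also makes the timestamping behaviour easy to exercise in isolation.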
Referring now to
The location server (155) then determines further location information associated with the wireless location devices (110a, 110b) which corresponds to one or more of the coverage areas (120x-120z) at step 320. For each wireless location device (110a), the location server (155) may identify a signal strength measurement and a corresponding base station location from the base station identifier, and estimate the position of the device (110a) using trilateration, triangulation or any other suitable locating method as would be appreciated by those skilled in the art. The estimated position will typically correspond to the predetermined locations of one of the coverage areas, in other words the estimated position is within one of the coverage areas. The location server then timestamps the determined location information (in this example the estimated position) at step 325. This step may be implemented using timestamp signals received from a time-stamping function (180) also used by the image server (150), however an internal clock will typically be adequate. The location server (155) then stores the determined location information at step 330. Whilst the determined location information has been described in this embodiment as an estimated position, or base station locations together with wireless location device signal strengths, the location information could simply be an identifier for the coverage area corresponding to the estimated position of the wireless location device.
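The position estimation of step 320 can be sketched with basic trilateration: linearize the three range equations about the known base-station locations and solve the resulting 2x2 linear system. This is only one of the "trilateration, triangulation or any other suitable locating method" options the text mentions, and the function below is an illustrative sketch, not the embodiment's implementation.

```python
def trilaterate(stations, distances):
    """Estimate a 2-D position from three base stations with known (x, y)
    locations and estimated distances to the wireless location device.

    Subtracting the first range equation from the other two removes the
    quadratic terms, leaving two linear equations solved here by Cramer's
    rule. Distances derived from signal strength are noisy, so a real
    system would likely prefer a least-squares fit over more stations."""
    (x1, y1), (x2, y2), (x3, y3) = stations
    d1, d2, d3 = distances
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1 ** 2 - d2 ** 2 - x1 ** 2 + x2 ** 2 - y1 ** 2 + y2 ** 2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1 ** 2 - d3 ** 2 - x1 ** 2 + x3 ** 2 - y1 ** 2 + y3 ** 2
    det = a1 * b2 - a2 * b1
    if det == 0:
        raise ValueError("base stations are collinear; position is ambiguous")
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)
```

The estimated position can then be matched against the predetermined coverage-area locations to decide which coverage area the device is in.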
Referring now to
The tracking server (160) receives this location information and determines which coverage areas (120x-120z) each location information corresponds to at each time interval at step 420. The correspondence between the location information and the coverage areas is available using the predetermined locations of the coverage areas (120x-120z). The tracking server (160) then requests images from the image server (150) which correspond to the determined coverage areas and respective time intervals at step 425. The requested times correspond to the timestamps used by the location server 155, and also in some embodiments by the image server (150). The image server (150) receives these requested coverage areas and respective time intervals from the tracking server (160) and returns the corresponding recorded and stored images at step 430. The image server may implement this step by matching the requested coverage areas with respective camera identifiers and searching for image files having these camera identifiers and the requested time intervals. The tracking server (160) retrieves these images from the image server (150) at step 435. The retrieved image files are recorded images of the coverage areas corresponding to the location information of the identified object at each time interval over the surveillance time window. The tracking server (160) may arrange the received images into chronological order at step 440, for example using the timestamps associated with each image. The images of the coverage areas (120x-120z) traversed by the object (105a) are then displayed on the display screen by the tracking server at step 445. Thus the object (105a) can be tracked over the surveillance time window by viewing the images of the coverage areas showing the object. For example a lost child can be tracked or viewed as he or she moves around an amusement park to determine whether the child has simply become lost or has been abducted.
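The image-server side of steps 425-440 — mapping each requested coverage area to its camera identifier, collecting the stored frames in the requested intervals, and returning them in chronological order for display — might look like the following sketch. The area-to-camera mapping and the store layout are assumptions made for illustration.

```python
# Assumed mapping of coverage areas to their cameras (from the figure's
# reference numerals); a real deployment would configure this explicitly.
CAMERA_FOR_AREA = {"120x": "115x", "120y": "115y", "120z": "115z"}


def fulfil_request(store, requests):
    """Steps 425-440 from the image server's side: map each requested
    (coverage_area, t_start, t_end) to a camera identifier, gather the
    matching timestamped frames, and return them chronologically ordered.

    store: dict of camera_id -> list of (timestamp, image) tuples."""
    frames = []
    for area, t0, t1 in requests:
        cam = CAMERA_FOR_AREA[area]
        frames += [(ts, img) for ts, img in store.get(cam, []) if t0 <= ts <= t1]
    return [img for ts, img in sorted(frames)]
```

Sorting on the `(timestamp, image)` tuples reproduces step 440's chronological ordering even when the requested intervals span several coverage areas.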
The tracking server 160 may additionally be arranged to display images from the coverage area in which an object is currently located. This may be implemented by interrogating the location server on the latest location information for the identified object and wireless location device, and requesting images from the image server of the coverage area corresponding to that location information. Indeed a direct feed from the camera 115x-115z associated with the coverage area may be displayed on the screen 165.
Referring now to
Whilst the embodiment has been described with respect to one object, it may be implemented with respect to many such objects, so that whenever the predetermined location of one of these objects changes, the storing of images is triggered.
In an example implementation, a notebook computer (105b) may have a normal or predetermined location which may or may not be within one of the coverage areas (120x-120y). When the notebook is removed from this predetermined location, the system (100) is configured by methods 500 and 200 to start recording images of the coverage areas in order to enable tracking of the notebook computer. Thus images of the notebook computer (105b) may be used to determine whether the notebook computer was legitimately moved by an authorized person, or has been stolen. If the notebook computer has been stolen, then the thief may be tracked on through the coverage areas, and perhaps their identity determined manually or by the public release of suitable images of the thief.
Another method of triggering the image server to start recording images in response to a predetermined event is shown in FIG. 5B. This method 550 may be implemented by the tracking server 160 and the image server 150 of
Whilst the embodiment has been described with respect to one object, it may be implemented with respect to many such objects, so that whenever one of the objects is detected within one of the coverage areas, the storing of images is triggered. Alternatively, storing of images of each of the coverage areas may be triggered independently by the detection of one of a number of objects within the respective coverage area. Such an arrangement reduces the storage space required for the image files, as only images of one or more predetermined objects are stored. Furthermore, in addition or alternatively, the cameras of the respective coverage areas may be arranged to start recording in response to the trigger instructions from the tracking server. In a further arrangement, recording and/or storing of images for a coverage area may be stopped when no objects are detected within the coverage area.
In an example implementation, a person (105a) such as a child in an amusement arcade may receive an RFID tag on a wrist-band when entering. The storing of images from a particular camera may then be triggered upon detection of the child within a corresponding coverage area. In other words, location information associated with the RFID tag (110a) and recorded in the location server (155) is monitored to determine when it corresponds with a coverage area (120x). The image server (150) is then instructed to store images received from the camera (115x) corresponding to the coverage area (120x) which the child (105a) has just entered. Storing of images of the child in different coverage areas may then be triggered as the child enters these areas. Similarly storing of images from other coverage areas may also be triggered when different children enter them. Thus even though there is not continuous image recording of all coverage areas, there is continuous image recording of all objects.
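The per-area trigger logic described above — start storing a coverage area's images while at least one tagged object is inside it, stop when none are — can be sketched as a small stateful helper. The class name and the `('start'/'stop', area)` instruction format are hypothetical conveniences, not part of the embodiment.

```python
class RecordingTrigger:
    """Tracks which coverage areas are currently being recorded and emits
    ('start', area) / ('stop', area) instructions for the image server as
    tagged objects enter and leave areas. A minimal sketch; the tracking
    server would obtain these location updates from the location server."""

    def __init__(self):
        self.recording = set()

    def update(self, latest_locations):
        """latest_locations: dict of tag_id -> coverage_area, with None
        for a tag currently outside every coverage area. Returns the
        instructions needed to reconcile recording state."""
        wanted = {a for a in latest_locations.values() if a is not None}
        actions = [("start", a) for a in sorted(wanted - self.recording)]
        actions += [("stop", a) for a in sorted(self.recording - wanted)]
        self.recording = wanted
        return actions
```

Each call reconciles the set of areas that should be recording with the set that already is, so storage is consumed only while a tagged object is present, matching the storage-saving scheme described above.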
Referring now to
The skilled person will recognise that the above-described apparatus and methods may be embodied as processor control code, for example on a carrier medium such as a disk, CD- or DVD-ROM, programmed memory such as read only memory (Firmware), or on a data carrier such as an optical or electrical signal carrier. For some applications embodiments of the invention may be implemented on a DSP (Digital Signal Processor), ASIC (Application Specific Integrated Circuit) or FPGA (Field Programmable Gate Array). Thus the code may comprise conventional programme code or microcode or, for example code for setting up or controlling an ASIC or FPGA. The code may also comprise code for dynamically configuring re-configurable apparatus such as re-programmable logic gate arrays. Similarly the code may comprise code for a hardware description language such as Verilog™ or VHDL (Very high speed integrated circuit Hardware Description Language). As the skilled person will appreciate, the code may be distributed between a plurality of coupled components in communication with one another. Where appropriate, the embodiments may also be implemented using code running on a field-(re)programmable analogue array or similar device in order to configure analogue hardware.
The skilled person will also appreciate that the various embodiments and specific features described with respect to them could be freely combined with the other embodiments or their specifically described features in general accordance with the above teaching. The skilled person will also recognise that various alterations and modifications can be made to specific examples described without departing from the scope of the appended claims.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.
Ganesan, Karthikeyan, Variyath, Girish S., Jayaraman, Vikram
8963916, | Aug 26 2011 | ReinCloud Corporation | Coherent presentation of multiple reality and interaction models |
9274595, | Aug 26 2011 | ReinCloud Corporation | Coherent presentation of multiple reality and interaction models |
9306714, | Jul 04 2011 | NOKIA SOLUTIONS AND NETWORKS OY | Method and apparatuses for configuring a communication channel |
9911166, | Sep 28 2012 | ZOLL Medical Corporation | Systems and methods for three-dimensional interaction monitoring in an EMS environment |
9998656, | Jul 21 2014 | Robert Bosch GmbH | Monitoring system comprising a reflecting device and correspondence module and method for setting up the monitoring system |
Patent | Priority | Assignee | Title |
2911462, | |||
3793489, | |||
3909121, | |||
4494144, | Jun 28 1982 | AT&T Bell Laboratories | Reduced bandwidth video transmission |
4750123, | Aug 30 1985 | Texas Instruments Incorporated | Method for predicting tracking cameras for free-roaming mobile robots |
4815132, | Aug 30 1985 | Kabushiki Kaisha Toshiba | Stereophonic voice signal transmission system |
4853764, | Sep 16 1988 | Pedalo, Inc. | Method and apparatus for screenless panoramic stereo TV system |
4961211, | Jun 30 1987 | NEC Corporation | Television conference system including many television monitors and method for controlling the same |
5020098, | Nov 03 1989 | AT&T Bell Laboratories | Telephone conferencing arrangement |
5136652, | Nov 14 1985 | TAIWAN SEMICONDUCTOR MANUFACTURING CO , LTD | Amplitude enhanced sampled clipped speech encoder and decoder |
5187571, | Feb 01 1991 | TTI Inventions A LLC | Television system for displaying multiple views of a remote location |
5200818, | Mar 22 1991 | Video imaging system with interactive windowing capability | |
5249035, | Nov 26 1990 | Kabushiki Kaisha Toshiba | Method of measuring three dimensional shape |
5255211, | Feb 22 1990 | Redmond Productions, Inc.; REDMOND PRODUCTIONS, INC | Methods and apparatus for generating and processing synthetic and absolute real time environments |
5268734, | May 31 1990 | GVBB HOLDINGS S A R L | Remote tracking system for moving picture cameras and method |
5317405, | Mar 08 1991 | Nippon Telegraph and Telephone Corporation | Display and image capture apparatus which enables eye contact |
5337363, | Nov 02 1992 | 3DO COMPANY, THE | Method for generating three dimensional sound |
5347363, | Jul 25 1991 | Kabushiki Kaisha Toshiba | External lead shape measurement apparatus for measuring lead shape of semiconductor package by using stereoscopic vision |
5406326, | Aug 06 1992 | Mediapod LLC | Video system for producing video image simulating the appearance of motion picture or other photographic film |
5423554, | Sep 24 1993 | CCG METAMEDIA, INC ; DOWTRONE PRESS KG, LLC | Virtual reality game method and apparatus |
5446834, | Apr 28 1992 | Sun Microsystems, Inc. | Method and apparatus for high resolution virtual reality systems using head tracked display |
5448287, | May 03 1993 | Spatial video display system | |
5467401, | Oct 13 1992 | MATSUSHITA ELECTRIC INDUSTRIAL CO , LTD | Sound environment simulator using a computer simulation and a method of analyzing a sound space |
5495576, | Jan 11 1993 | INTELLECTUAL VENTURS FUND 59 LLC; INTELLECTUAL VENTURES FUND 59 LLC | Panoramic image based virtual reality/telepresence audio-visual system and method |
5502481, | Nov 16 1992 | Reveo, Inc | Desktop-based projection display system for stereoscopic viewing of displayed imagery over a wide field of view |
5532737, | May 03 1993 | Regents of the University of California, The | Camera arrangement with wide field of view |
5541639, | Oct 23 1992 | Hitachi, LTD | Video conference system automatically started at reserved time |
5541773, | Mar 26 1993 | Olympus Optical Co., Ltd. | Two-unit zoom lens system |
5625410, | Apr 21 1993 | HAWK TECHNOLOGY SYSTEMS, LLC | Video monitoring and conferencing system |
5666153, | Oct 03 1995 | TALLARD B V | Retractable teleconferencing apparatus |
5675374, | Nov 26 1993 | Fujitsu Limited | Video teleconferencing system |
5729471, | Mar 31 1995 | The Regents of the University of California | Machine dynamic selection of one video camera/image of a scene from multiple video cameras/images of the scene in accordance with a particular perspective on the scene, an object in the scene, or an event in the scene |
5748121, | Dec 06 1995 | Intel Corporation | Generation of huffman tables for signal encoding |
5760826, | May 10 1996 | TRUSTEES OF COLUMBIA UNIVERSITY IN THE CITY OF NEW YORK, THE | Omnidirectional imaging apparatus |
5790182, | Aug 05 1996 | Vulcan Patents LLC | System and method for panoramic imaging using concentric spherical mirrors |
5815196, | Dec 29 1995 | Alcatel Lucent | Videophone with continuous speech-to-subtitles translation |
5940118, | Dec 22 1997 | RPX CLEARINGHOUSE LLC | System and method for steering directional microphones |
5940530, | Jul 21 1994 | Matsushita Electric Industrial Co., Ltd. | Backlit scene and people scene detecting method and apparatus and a gradation correction apparatus |
5956100, | Aug 17 1998 | Background light shield for a video display | |
6101113, | Dec 02 1999 | MAGNETIC TECHNOLOGIES, LLC | Transformers for multipulse AC/DC converters |
6124896, | Feb 20 1997 | Sony Corporation | Corner detection device and corner detection method |
6148092, | Jan 08 1998 | Sharp Kabushiki Kaisha | System for detecting skin-tone regions within an image |
6167162, | Oct 23 1998 | WSOU Investments, LLC | Rate-distortion optimized coding mode selection for video coders |
6226035, | Mar 04 1998 | Remotereality Corporation | Adjustable imaging system with wide angle capability |
6249318, | Sep 12 1997 | VID SCALE, INC | Video coding/decoding arrangement and method therefor |
6266082, | Dec 19 1995 | Canon Kabushiki Kaisha | Communication apparatus image processing apparatus communication method and image processing method |
6285392, | Nov 30 1998 | NEC Corporation | Multi-site television conference system and central control apparatus and conference terminal for use with the system |
6424377, | Jun 24 1996 | CEDAR LANE TECHNOLOGIES INC | Panoramic camera |
6493032, | Jun 24 1996 | CEDAR LANE TECHNOLOGIES INC | Imaging arrangement which allows for capturing an image of a view at different resolutions |
6583808, | Oct 04 2001 | National Research Council of Canada | Method and system for stereo videoconferencing |
6593956, | May 15 1998 | Polycom, Inc | Locating an audio source |
6680856, | Mar 22 2001 | Semikron Elektronik GmbH | Power converter circuit arrangement for generators with dynamically variable power output |
6704048, | Aug 27 1998 | Polycom, Inc | Adaptive electronic zoom control |
6751106, | Jul 25 2002 | General Electric Company | Cross current control for power converter systems and integrated magnetic choke assembly |
6819354, | Jun 13 2000 | OmniVision Technologies, Inc | Completely integrated helmet camera |
6917271, | Jul 25 2002 | General Electric Company | Cross current control for power converter systems and integrated magnetic choke assembly |
6963653, | Oct 22 2003 | The Research Foundation for The State University of New York | High-order directional microphone diaphragm |
6980526, | Mar 24 2000 | TELECONFERENCE SYSTEMS LLC | Multiple subscriber videoconferencing system |
6990086, | Jan 26 2001 | Cisco Technology, Inc | Method and system for label edge routing in a wireless network |
7002973, | Dec 11 2000 | ACME PACKET, INC | System and method for assisting in controlling real-time transport protocol flow through multiple networks via use of a cluster of session routers |
7028092, | Dec 11 2000 | ACME PACKET, INC | System and method for assisting in controlling real-time transport protocol flow through multiple networks via media flow routing |
7031311, | Jul 23 2001 | ACME PACKET, INC | System and method for providing rapid rerouting of real-time multi-media flows |
7057662, | Nov 22 2002 | HEWLETT-PACKARD DEVELOPMENT COMPANY, L P | Retractable camera apparatus |
7061896, | Sep 20 2000 | RPX Corporation | Wireless label switched packet transfer network |
7080157, | Jan 11 1999 | GOOGLE LLC | Performing multicast communication in computer networks by using overlay routing |
7111045, | Jun 22 2000 | Canon Kabushiki Kaisha | Image distribution system, and image distribution method and program therefor |
7136651, | Aug 30 2004 | RIBBON COMMUNICATIONS OPERATING COMPANY, INC | Mobile services control platform providing a converged voice service |
7158674, | Dec 27 2001 | LG Electronics Inc. | Scene change detection apparatus |
7161942, | Jan 31 2002 | TELCORDIA LEGACY INC | Method for distributing and conditioning traffic for mobile networks based on differentiated services |
7246118, | Jul 06 2001 | International Business Machines Corporation | Method and system for automated collaboration using electronic book highlights and notations |
7336299, | Jul 03 2003 | MERCURY MISSION SYSTEMS, LLC | Panoramic video system with real-time distortion-free imaging |
7353279, | Jan 08 2004 | U S BANK NATIONAL ASSOCIATION | Proxy architecture for providing quality of service(QoS) reservations |
7359731, | Sep 09 2004 | Nextel Communications Inc. | Architecture to facilitate interoperability and inter-working of push to talk technologies |
7411975, | Aug 26 2004 | Juniper Networks, Inc | Multimedia over internet protocol border controller for network-based virtual private networks |
7428000, | Jun 26 2003 | Microsoft Technology Licensing, LLC | System and method for distributed meetings |
7471320, | Aug 27 1998 | Polycom, Inc. | Electronic pan tilt zoom video camera with adaptive edge sharpening filter |
7477657, | May 08 2002 | Juniper Networks, Inc | Aggregating end-to-end QoS signaled packet flows through label switched paths |
7518051, | Aug 19 2005 | EJAMMING, INC | Method and apparatus for remote real time collaborative music performance and recording thereof |
7545761, | Jun 08 2005 | Cellco Partnership | Session classification for differentiated prepaid accounting |
7616226, | Apr 22 2004 | Sound View Innovations, LLC | Video conference system and a method for providing an individual perspective view for a participant of a video conference between multiple participants |
7990422, | Jul 19 2004 | GRANDEYE, LTD | Automatically expanding the zoom capability of a wide-angle video camera |
20020108125, | |||
20020140804, | |||
20020149672, | |||
20020186528, | |||
20030048218, | |||
20030072460, | |||
20030160861, | |||
20040003411, | |||
20040061787, | |||
20040091232, | |||
20040119814, | |||
20040164858, | |||
20040178955, | |||
20040246962, | |||
20040254982, | |||
20040260796, | |||
20050007954, | |||
20050024484, | |||
20050081160, | |||
20050110867, | |||
20050117022, | |||
20050147257, | |||
20050248652, | |||
20050268823, | |||
20060017807, | |||
20060028983, | |||
20060056056, | |||
20060066717, | |||
20060082643, | |||
20060120307, | |||
20060120568, | |||
20060125691, | |||
20060152489, | |||
20060152575, | |||
20060182436, | |||
20060256187, | |||
20060274157, | |||
20060284786, | |||
20070039030, | |||
20070040928, | |||
20070052856, | |||
20070109411, | |||
20070121353, | |||
20070140337, | |||
20070182818, | |||
20070206556, | |||
20070217406, | |||
20070217500, | |||
20070222865, | |||
20070247470, | |||
20070250567, | |||
20070250620, | |||
20080077390, | |||
20080167078, | |||
20080208444, | |||
20080240237, | |||
20080240571, | |||
20090009593, | |||
20090122867, | |||
20090207233, | |||
20090207234, | |||
20090244257, | |||
20090256901, | |||
20090279476, | |||
20090324023, | |||
20100123770, | |||
20100171808, | |||
20100208078, | |||
20100225732, | |||
20100283829, | |||
CN101953158, | |||
CN102067593, | |||
D533525, | Oct 21 2004 | Sony Corporation | Combined monitor/television receiver and camera |
D533852, | May 27 2005 | Hannspree, Inc. | Television set |
D534511, | Nov 25 2004 | PHC HOLDINGS CORPORATION | Combined television receiver with digital video disc player and video tape recorder |
D535954, | Sep 02 2004 | LG Electronics Inc | Television |
D539243, | Dec 28 2004 | GM Global Technology Operations, Inc | Television |
D541773, | Jan 09 2006 | Inventec Corporation | Internet protocol LCD TV |
D542247, | Jun 24 2005 | Sony Corporation | Combined television receiver and disc player |
D550635, | Jan 04 2006 | Microsoft Corporation | Monitor |
D551184, | Jan 24 2005 | Victor Company of Japan, Limited | Television receiver |
D555610, | Jul 15 2005 | SAMSUNG ELECTRONICS CO , LTD | PDP TV receiver |
D567202, | Jul 31 2007 | Shenzhen TCL New Technology co., LTD. | LCD TV |
D578496, | Nov 05 2007 | Dell Products L.P. | Information handling system |
D588560, | Apr 28 2006 | Cisco Technology, Inc | Endpoint for a videoconference |
D602453, | Mar 27 2009 | Dell Products L.P. | Display device |
D610560, | Apr 01 2009 | Hannspree, Inc. | Display |
D615514, | Aug 20 2009 | Cisco Technology, Inc | Single monitor and stand |
D626102, | Mar 21 2010 | Cisco Technology, Inc | Video unit with integrated features |
D626103, | Mar 21 2010 | Cisco Technology, Inc. | Video unit with integrated features |
D628175, | Mar 21 2010 | Cisco Technology, Inc. | Mounted video unit |
D628968, | Mar 21 2010 | Cisco Technology, Inc. | Free-standing video unit |
EP650299, | |||
EP714081, | |||
EP740177, | |||
EP1178352, | |||
EP1589758, | |||
EP1701308, | |||
EP1768058, | |||
GB2294605, | |||
GB2355876, | |||
WO2005013001, | |||
WO2005031001, | |||
WO2007106157, | |||
WO2007123946, | |||
WO2007123960, | |||
WO2008040258, | |||
WO2008101117, | |||
WO2008118887, | |||
WO2009102503, | |||
WO2009120814, | |||
WO2010059481, | |||
WO2010096342, | |||
WO2010104765, | |||
WO2010132271, | |||
WO9416517, | |||
WO9621321, | |||
WO9708896, | |||
WO9847291, | |||
WO9959026, |
Executed on | Assignor | Assignee | Conveyance | Frame | Reel | Doc |
Jun 06 2007 | VARIYATH, GIRISH S | Cisco Technology, Inc | ASSIGNMENT OF ASSIGNORS INTEREST SEE DOCUMENT FOR DETAILS | 019418 | /0046 | |
Jun 06 2007 | GANESAN, KARTHIKEYAN NMI | Cisco Technology, Inc | ASSIGNMENT OF ASSIGNORS INTEREST SEE DOCUMENT FOR DETAILS | 019418 | /0046 | |
Jun 06 2007 | JAYARAMAN, VIKRAM NMI | Cisco Technology, Inc | ASSIGNMENT OF ASSIGNORS INTEREST SEE DOCUMENT FOR DETAILS | 019418 | /0046 | |
Jun 08 2007 | Cisco Technology, Inc. | (assignment on the face of the patent) | / |
Date | Maintenance Fee Events |
May 01 2017 | M1551: Payment of Maintenance Fee, 4th Year, Large Entity. |
Apr 29 2021 | M1552: Payment of Maintenance Fee, 8th Year, Large Entity. |
Date | Maintenance Schedule |
Oct 29 2016 | 4 years fee payment window open |
Apr 29 2017 | 6 months grace period start (w surcharge) |
Oct 29 2017 | patent expiry (for year 4) |
Oct 29 2019 | 2 years to revive unintentionally abandoned end. (for year 4) |
Oct 29 2020 | 8 years fee payment window open |
Apr 29 2021 | 6 months grace period start (w surcharge) |
Oct 29 2021 | patent expiry (for year 8) |
Oct 29 2023 | 2 years to revive unintentionally abandoned end. (for year 8) |
Oct 29 2024 | 12 years fee payment window open |
Apr 29 2025 | 6 months grace period start (w surcharge) |
Oct 29 2025 | patent expiry (for year 12) |
Oct 29 2027 | 2 years to revive unintentionally abandoned end. (for year 12) |