In one embodiment, a method of tracking an object carrying a wireless location device comprises recording and storing images from a plurality of cameras corresponding to respective coverage areas having predetermined locations, determining location information associated with the wireless location device, the location information corresponding to one or more of said coverage areas, determining which of the images correspond to the location information, and retrieving said images.

Patent: 8570373
Priority: Jun 08 2007
Filed: Jun 08 2007
Issued: Oct 29 2013
Expiry: Feb 28 2031
Extension: 1361 days
Entity: Large
1. A method comprising:
recording and storing still images on an image server from a plurality of still cameras corresponding to respective coverage areas having predetermined locations, wherein the still cameras record the still images at periodic intervals, wherein recording and storing the still images is automatically initiated in response to a predetermined event such that no recording and storing of the still images occurs prior to the predetermined event;
determining, storing and timestamping location information associated with a wireless location device carried by an object, the location information corresponding to one or more of the coverage areas, wherein the location information is determined based at least upon a determined distance of the wireless location device from one or more base stations;
in response to a surveillance request comprising a surveillance time window associated with the object that includes a time period prior to a current time period, determining which timestamped location information associated with the wireless location device carried by the object is within the surveillance time window;
determining which of the still images stored on the image server correspond to the location information associated with the surveillance time window; and
retrieving still images from the image server that are associated with the surveillance time window and that depend upon the location information associated with the object in order to track the object.
15. A system comprising:
a plurality of still cameras arranged to record still images from respective coverage areas having predetermined locations, wherein the still cameras record the still images at periodic intervals;
an image server coupled to the still cameras and arranged to store the still images recorded by the still cameras, wherein the image server is further arranged to automatically initiate recording of the still images in response to a predetermined event such that no recording of the still images occurs prior to the predetermined event;
a plurality of wireless base stations having respective predetermined locations and operable to communicate wirelessly with a wireless location device carried by an object;
a location server arranged to determine, store and timestamp location information associated with a wireless location device carried by the object, the location information corresponding to one or more of the coverage areas, wherein the location information is determined based at least upon a determined distance of the wireless location device from one or more base stations; and
a tracking server arranged to receive a surveillance request comprising a surveillance time window associated with the object that includes a time period prior to a current time period, to determine which of the still images from the image server correspond to the location information based upon timestamps associated with the location information, and to retrieve the still images from the image server that are associated with the surveillance time window and depend upon the location information associated with the object and with timestamps within the surveillance window in order to track the object.
2. The method of claim 1,
wherein the still images are stored together with respective timestamps that correspond with timestamps of the location information.
3. The method of claim 2,
wherein determining which of the still images correspond to the location information comprises matching the predetermined locations of the coverage areas with the location information associated with the wireless location device and respective timestamps associated with the still images and location information.
4. The method of claim 1,
wherein determining the location information comprises receiving wireless transmissions from the wireless location device.
5. The method of claim 1,
wherein the one or more base stations have predetermined locations and receive wireless transmissions from the wireless location device.
6. The method of claim 4,
wherein the location information comprises one or more of estimated position of the wireless location device, coverage area identifier, signal strength of wireless transmissions received from the wireless location device, wireless location device identifier, base station identifier, and GPS coordinates of the wireless location device.
7. The method of claim 1, the predetermined event being a change in the location information associated with the object that corresponds to a predetermined change in location of the object.
8. The method of claim 1,
the predetermined event being the determination that the object is located within one of the coverage areas.
9. The method of claim 1, further comprising:
determining second location information associated with a second location device carried by a second object, the second location information corresponding to one or more of the coverage areas; and
determining which of the still images correspond to the second location information, and retrieving the still images.
10. The method of claim 1,
wherein the determined distance of the wireless location device from one or more base stations is based upon a signal strength of a remote device in relation to each base station.
11. The method of claim 1, further comprising:
obtaining location information at the wireless location device based upon signals received by the wireless location device from the one or more base stations.
12. The method of claim 1,
wherein the recording and storing still images on the image server occurs independently in relation to the surveillance request, the determining which still images stored on the image server correspond to the location information associated with the surveillance time window, and the retrieving of still images from the image server that are associated with the surveillance time window, such that at least some recorded and stored still images on the image server are not associated with the surveillance request associated with the object and are thus not retrieved from the image server.
13. The method of claim 1,
wherein the object is lost, and the surveillance request comprises a request to find the lost object.
14. The method of claim 1,
wherein the object comprises one of a person and a computer.
16. The system of claim 15,
wherein the location information comprises one or more of estimated position of the wireless location device, coverage area identifier, signal strength of wireless transmissions received from the wireless location device, wireless location device identifier, base station identifier, and GPS coordinates of the wireless location device.
17. The system of claim 15,
wherein the still images stored on the image server are each associated with a respective timestamp, and the timestamps associated with the still images correspond with timestamps associated with the location information.
18. The system of claim 15,
wherein the tracking server is further arranged to trigger storing of the still images by the image server in response to the predetermined event.
19. The system of claim 15, wherein:
the location server is further arranged to determine and store second location information associated with a second location device carried by a second object, the second location information corresponding to one or more of the coverage areas; and
the tracking server is further arranged to determine which of the still images correspond to the second location information, and to retrieve the still images.
20. The system of claim 15,
wherein the location server is further arranged to determine the distance of the wireless location device from one or more base stations based upon a signal strength of a remote device in relation to each base station.
21. The system of claim 15,
wherein the base stations provide signals to the wireless location device, the wireless location device determines location information based upon the signals received from the base stations, and the location server receives location information from the wireless location device.

The present disclosure relates generally to tracking and surveillance of an object.

Fixed video and still cameras may be used for monitoring an object such as a person within a given coverage or monitored area. Examples include security cameras mounted in and about buildings which are monitored by security guards. Other cameras may incorporate controllable movement so that a security guard may track an object, for example to follow a person of interest moving about a building. Various surveillance systems are available to detect an object, for example by detecting movement or scene changes. The object may then be tracked or otherwise monitored using fixed and/or moveable cameras.

In particular embodiments a method of tracking an object carrying a wireless location device is provided. The method comprises recording and storing images from a plurality of cameras corresponding to respective coverage areas having predetermined locations, and determining location information associated with the wireless location device, the location information corresponding to one or more of the coverage areas. The method further comprises determining which of the images correspond to the location information, and retrieving these images.

In particular embodiments a system for tracking an object carrying a wireless location device is provided. The system comprises a plurality of cameras arranged to record images from respective coverage areas having predetermined locations, an image server coupled to the cameras and arranged to store the images recorded by the cameras, a location server arranged to determine and store location information associated with the wireless location device, the location information corresponding to one or more of said coverage areas, and a tracking server arranged to determine which of the images from the image server correspond to the location information, and to retrieve these images.

In particular embodiments a method of surveillance of an object is provided. The method comprises receiving a surveillance request comprising a surveillance time window, determining location information associated with the object over this surveillance time window, the location information corresponding to two or more coverage areas having predetermined locations, retrieving images of the two or more coverage areas which correspond to the location information over the surveillance time window, and displaying the images retrieved.

In particular embodiments a tracking server for tracking an object carrying a wireless location device is provided. The tracking server comprises a processor arranged to determine location information associated with the object over a surveillance time window in response to receiving a surveillance request comprising said surveillance time window. The location information corresponds to two or more coverage areas having predetermined locations. The processor is further arranged to retrieve images of the two or more coverage areas which correspond to the location information over the surveillance time window.

Embodiments are described with reference to the following drawings, by way of example only and without intending to be limiting, in which:

FIG. 1 illustrates an example system for tracking an object;

FIG. 2 illustrates an example method of recording and storing images;

FIG. 3 illustrates an example method of determining location information associated with the object;

FIG. 4 illustrates an example method of retrieving surveillance images of the object;

FIG. 5A illustrates an example method of triggering storing of images of the object;

FIG. 5B illustrates another example method of triggering storing of images of the object; and

FIG. 6 illustrates an example tracking server for use with the system of FIG. 1.

Referring to FIG. 1, a system for tracking an object according to an example embodiment is shown. The system 100 comprises a plurality of cameras 115x, 115y, 115z, each recording images of respective coverage areas 120x, 120y, 120z. The coverage areas 120x-120z may be any suitable size or sizes; each coverage area has a fixed, known or predetermined location; and the areas may or may not overlap each other. Together, the coverage areas form a surveillance region. The cameras 115x-115z may be video cameras, still cameras periodically recording images, webcams or any other suitable image recording devices. The system 100 also comprises a plurality of base stations 125a, 125b, 125c, each of which has a respective predetermined or known location. The system further comprises an image server 150 coupled to the cameras 115x-115z, a location server 155 coupled to the base stations 125a-125c, and a tracking server 160 coupled to the image server 150, the location server 155 and a display screen 165. A timestamp generator 180 is shown separately coupled to the image server 150 and location server 155 for simplicity of explanation; however, this functionality may be implemented within the image server 150 and location server 155 using internal clocks and processing and storage techniques that would be appreciated by those skilled in the art.

The system 100 is used to track one or more objects 105a, 105b through the surveillance region. The objects may include a person (105a), a notebook computer (105b), an artwork, or any moveable object. Each object carries a respective wireless location device 110a, 110b, for example in a pocket of a person (105a) or integrated within a notebook computer (105b). The wireless location devices 110a, 110b may be radio frequency identification (RFID) tags, or any wireless device, such as a mobile phone, which can be configured to communicate with the system in order to enable location information associated with the device to be determined.

The cameras 115x-115z periodically record images of their respective coverage areas 120x-120z, which may or may not include an object 105a, 105b, and forward these recorded images together with a respective camera identifier (CamID) to the image server 150. For example, each recorded image may be sent as an image file and associated camera identifier (170x, 170y, 170z). The cameras 115x-115z and image server 150 may be coupled using a local area network, coaxial cable or any other suitable mechanism as would be appreciated by those skilled in the art. The image server 150 timestamps the received image file and camera identifier (170x, 170y, 170z) using a suitable timestamp, such as a time from a common clock (180) also used by the location server 155 or an internal clock sufficiently synchronized with a corresponding internal clock within the location server 155. The timestamped image files and camera identifiers are then stored on the image server 150.

FIG. 1 shows a first object 105a in coverage area 120x at the same time as a second object 105b in coverage area 120y, images of both these objects 105a, 105b being recorded in the image server 150 together with their correspondence to a particular coverage area 120x, 120y. In this embodiment, correspondence to a particular coverage area is implemented using a respective camera identifier.

The base stations 125a-125c periodically determine location information associated with the wireless location devices 110a, 110b, for example by identifying nearby wireless location devices 110a, 110b and measuring the signal strength of signals received from these identified devices. The wireless location devices 110a, 110b are configured to periodically transmit their own unique device identifier (WLD_ID). The signal strength of this signal can then be measured by the receiving base stations 125a-125c, as will be appreciated by those skilled in the art. This signal strength measurement can then be used as a proxy for the range or distance between the wireless location device 110a, 110b and the respective base station 125a-125c. If the wireless location device signal is picked up by a number of base stations 125a-125c, then the relative measured signal strengths at each base station can be used to determine the relative position of the wireless location devices 110a, 110b using triangulation, as will also be appreciated by those skilled in the art. Knowing the locations of the base stations 125a-125c, the positions of the wireless location devices 110a, 110b can then be estimated. Various system configurations will be available to those skilled in the art in order to coordinate the activities of the base stations 125a-125c and wireless location devices 110a, 110b, for example to ensure that the base stations are listening for the wireless location device signal transmissions at the right time. This may be achieved, for example, by arranging the base stations to periodically transmit a common beacon signal to which each of the wireless location devices 110a, 110b is configured to respond.
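
By way of illustration only, the range-from-signal-strength and triangulation steps described above may be sketched in a few lines of Python. This is a minimal sketch, not taken from the patent: the log-distance path-loss model, its calibration constants, the coordinates and all names are assumptions.

def rssi_to_distance(rssi_dbm, tx_power_dbm=-40.0, path_loss_exp=2.5):
    """Estimate range from a measured signal strength using an assumed
    log-distance path-loss model (the constants are illustrative)."""
    return 10.0 ** ((tx_power_dbm - rssi_dbm) / (10.0 * path_loss_exp))

def trilaterate(base_stations, distances):
    """Estimate an (x, y) position from three known base station positions
    and three range estimates by linearizing the circle equations."""
    (x1, y1), (x2, y2), (x3, y3) = base_stations
    r1, r2, r3 = distances
    a11, a12 = 2.0 * (x2 - x1), 2.0 * (y2 - y1)
    a21, a22 = 2.0 * (x3 - x1), 2.0 * (y3 - y1)
    b1 = r1 ** 2 - r2 ** 2 + x2 ** 2 - x1 ** 2 + y2 ** 2 - y1 ** 2
    b2 = r1 ** 2 - r3 ** 2 + x3 ** 2 - x1 ** 2 + y3 ** 2 - y1 ** 2
    det = a11 * a22 - a12 * a21
    if abs(det) < 1e-9:
        raise ValueError("base stations must not be collinear")
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

# Example: a device heard by three base stations at assumed positions.
stations = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
ranges = [rssi_to_distance(r) for r in (-62.0, -70.0, -70.0)]
estimated_position = trilaterate(stations, ranges)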

The base stations 125a-125c are typically located in and around the coverage areas 120x-120z so that each coverage area may be “observed” by at least three base stations 125a-125c. In other words, if an object (105a) and hence a respective wireless location device (110a) are located in a coverage area (120x), then at least three base stations (125a, 125b, 125c) would normally receive and be able to measure the signal strength of signals from that wireless location device (110a).

The base stations 125a-125c forward the wireless location device (110a, 110b) identifiers (WLD_ID) and their respective signal strength measures to the location server 155, together with their respective base station identifiers (BS_ID). This location information 175a-175c is received by the location server 155 and corresponds to one or more of the coverage areas 120x-120z. In other words, because the base stations 125a-125c have known locations and are positioned around the coverage areas 120x-120z, the positions of the wireless location devices 110a, 110b can be estimated and "located" within or near one of the coverage areas 120x-120z. The location information 175a-175c can therefore include the wireless location device (110a, 110b) identifiers (WLD_ID), their respective signal strengths, and the base station identifier (BS_ID) of the base station 125a-125c that received the signal from the wireless location device 110a, 110b. Further location information may include the locations of the respective base stations 125a-125c, received signal angle-of-arrival information, received signal time-of-arrival information, and Global Positioning System (GPS) coordinates from the wireless location devices 110a, 110b.

The base stations 125a-125c may be coupled to the location server 155 by a local area network (LAN) or any other suitable mechanism. A common LAN (not shown) may be used for coupling the base stations 125a-125c and location server 155, as well as the cameras 115x-115z and image server 150.

The location information received by the location server 155 may simply be timestamped and stored, for example using the timestamp functionality 180 also used by the image server 150. Alternatively, the location server may further process this location information in order to determine further location information, for example by estimating a position for each wireless location device 110a, 110b. This position estimating may be implemented using the known locations of the base stations 125a-125c which received a signal from the respective wireless location devices 110a, 110b, together with the respective signal strengths of these signals. For example, taking the first object 105a in FIG. 1, the wireless location device 110a is located within coverage area 120x. Signals from this wireless location device 110a are received by base stations 125a, 125b, 125c, each at varying signal strengths dependent on distance. Base station 125a forwards location information 175a to the location server comprising the base station's identifier (BS_ID) together with the wireless location device identifier (WLD_ID) received from the wireless location device 110a, and the signal strength at which this received signal was measured. Similar location information 175b, 175c is also received by the location server 155 from base stations 125b, 125c. This location information 175a-175c may also include the identifiers and respective signal strengths of other wireless location devices 110b. The location server 155 then uses triangulation to determine a relative position for the wireless location device 110a, that is, relative to the three base stations 125a-125c. Knowing the locations of these base stations 125a-125c, the location server 155 can then determine an actual or estimated position for the wireless location device 110a, which is located in or otherwise corresponds to one of the coverage areas. The location server 155 may then associate the wireless location device 110a with the corresponding coverage area (120x) and timestamp and store this location information. Alternatively, the estimated location may be timestamped together with the wireless location device's identifier and stored. Thus the location server 155 is able to locate each of the wireless location devices 110a, 110b over time, and hence the objects 105a, 105b carrying them.
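
Continuing the illustration, the association of an estimated position with a coverage area, and the timestamping and storing of the result, might look as follows. This is a minimal sketch: the rectangular area boundaries, identifiers and record fields are hypothetical, and a Python list stands in for the location server's storage.

import time

COVERAGE_AREAS = {  # area identifier -> assumed (x_min, y_min, x_max, y_max)
    "120x": (0.0, 0.0, 10.0, 10.0),
    "120y": (10.0, 0.0, 20.0, 10.0),
    "120z": (20.0, 0.0, 30.0, 10.0),
}

location_store = []  # stands in for the location server's storage

def record_location(wld_id, x, y):
    """Associate an estimated position with the coverage area containing it,
    timestamp the result, and store it as location information."""
    area = next((name for name, (x0, y0, x1, y1) in COVERAGE_AREAS.items()
                 if x0 <= x <= x1 and y0 <= y <= y1), None)
    location_store.append({"wld_id": wld_id, "area": area,
                           "position": (x, y), "timestamp": time.time()})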

The system 100 of this embodiment therefore provides time-stamped images of each coverage area 120x-120z as well as time-stamped location information for each object 105a, 105b, this location information corresponding to one or more of the coverage areas. This allows the tracking server 160 to track a selected object 105a through the coverage areas over time, and hence to retrieve images of that object. Thus given a surveillance time window, the tracking server 160 can determine from the location server the location information of the selected object 105a over that surveillance time window. This location information may simply comprise the coverage area 120x-120z in which the wireless location device 110a carried by the selected object 105a was located at each of a number of time intervals within the surveillance time window. Alternatively, this coverage area information may be determined from other location information stored within the location server 155, for example wireless location device 110a identifiers, corresponding signal strengths and associated base station locations. Once the coverage areas 120x-120z and the respective time intervals during which the wireless location device 110a was located in each coverage area are determined, images corresponding to those coverage areas 120x-120z at those time intervals can be requested from the image server 150. The sequence of coverage areas over the surveillance time window can then be displayed on the screen 165 in order to track the object 105a.

The system of this embodiment may be used for many applications, for example tracking a lost child in an amusement park or other crowded public area or tracking a notebook computer which has been removed from its last known position. More generally, embodiments may be used for security surveillance, inventory tracking in enterprises, and any application that requires video surveillance.

In alternative embodiments, the wireless location devices 110a, 110b may be arranged to simply forward their estimated coordinates to the location server 155, without the need for signal strength measurement at base stations having known locations. For example, the wireless location devices 110a, 110b may incorporate GPS functionality and periodically forward their respective GPS coordinates to the location server 155 using a cellular network, or using WLAN base stations whose location is not required. In another example, the wireless location devices 110a, 110b may estimate their locations using signals received from base stations having known locations, and forward this location information to the location server 155. In yet a further example, a base station may be positioned within each coverage area 120x-120z such that when a wireless location device hands off from one base station to another, it can be determined that the wireless location device has also moved from one coverage area to another, the locations of the base stations or their correspondence with the coverage areas being known.
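
For the hand-off alternative just described, determining the coverage area reduces to a lookup. A minimal sketch, assuming one base station per coverage area and a hypothetical mapping:

BS_TO_AREA = {"125a": "120x", "125b": "120y", "125c": "120z"}  # assumed mapping

def area_after_handoff(new_bs_id):
    """Infer the device's new coverage area from the base station it has
    just handed off to (one base station per coverage area)."""
    return BS_TO_AREA.get(new_bs_id)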

In further alternative embodiments, the image server 150, location server 155, tracking server 160, screen 165, and timestamp function 180 may be implemented in a single computer system, or distributed in any suitable manner as would be appreciated by those skilled in the art. Furthermore, the functionality implemented in the image server 150, location server 155, and tracking server 160 may be combined or distributed differently in other apparatus.

Referring now to FIG. 2, a method of recording and storing images from a plurality of cameras corresponding to respective coverage areas is shown. This method 200 may be implemented by the cameras 115x-115z and image server 150 of FIG. 1; however, it should be understood that the method is not limited to being performed on these apparatus. Additionally, whilst the method indicates a particular order of steps, in other embodiments the steps could be ordered differently, and/or the steps could be modified or removed. The cameras (115x-115z) and image server (150) may be continuously recording and storing images; however, in order to save storage space this recording and/or storing may only be started in response to predetermined events or other triggers, as indicated at step 205. This step is described in more detail later; for the moment, it is assumed that the cameras and image server are continuously recording and storing images. Thus each camera (115x-115z) records images of a respective coverage area (120x-120z) at step 210. The coverage areas have predetermined locations, and the images may be recorded periodically, for example every second. Once an image has been recorded, each camera (115x-115z) forwards the image together with a camera identifier to the image server (150) at step 215. The recorded image may be forwarded as any suitable image file, such as one conforming to the JPEG (Joint Photographic Experts Group) or MPEG (Moving Picture Experts Group) standards. The camera identifier can be any suitable identifier which is unique within the system (100).

The image server (150) receives the recorded images and camera identifiers from a plurality of cameras (115x-115z) at step 220. Thus the image server 150 receives images of a plurality of fixed or known-location coverage areas (120x-120z) over time. The image server (150) then timestamps these image files (and camera identifiers) at step 225. This step may be implemented using timestamp signals received from a time-stamping function (180) also used by the location server (155); however, the time-stamping function does not require a high degree of precision, given the speed of the objects (105a, 105b), typically people or objects carried by people, moving about within the coverage areas (120x-120z). The image server then stores the timestamped image files and camera identifiers at step 230. Given the large size of image files, reduced-resolution images or a reduced frequency of recorded images may be used in order to reduce the storage requirements in some implementations. Similarly, images may only be stored when a wireless location device (110a, 110b) has been determined to be within the coverage area, as will be described in more detail below.
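
As a rough illustration of steps 220-230, and under the same assumptions as the earlier sketches, the image server's receive-timestamp-store path might be expressed as follows; the record fields are hypothetical.

import time

image_store = []  # stands in for the image server's storage

def on_image_received(cam_id, image_file):
    """Steps 220-230: receive an image file and its camera identifier,
    timestamp them, and store the timestamped record."""
    image_store.append({"cam_id": cam_id,
                        "timestamp": time.time(),  # common clock 180
                        "image": image_file})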

Referring now to FIG. 3, a method of determining location information associated with wireless location devices and corresponding to one or more of the coverage areas is shown. This method 300 may be implemented by the wireless location devices 110a, 110b, base stations 125a-125c and location server 155 of FIG. 1; however, it should be understood that the method is not limited to being performed on these apparatus. Additionally, whilst the method indicates a particular order of steps, in other embodiments the steps could be ordered differently, and/or the steps could be modified or removed. Each base station (125a-125c) periodically determines location information in the form of wireless location device identifiers and corresponding signal strength measurements at step 305. Each base station may receive signals from a number of wireless location devices (110a, 110b), each signal carrying the identifier for the respective device. The signal strength of each received signal can be measured in various ways, for example using the RSSI (received signal strength indication) parameter. Thus each base station may determine a number of device identifiers and respective signal strength measurements periodically, for example once every second. Each base station forwards any wireless device identifiers and signal strength measurements it has determined to the location server at step 310. This forwarded location information (175a-175c) also includes an identifier for the base station (125a-125c) which uniquely identifies the base station within the system (100). The location server (155) receives this location information (wireless location device identifier(s), respective signal strength measurements, and base station identifier(s)) at step 315.
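
The location information forwarded at step 310 and gathered at step 315 can be modelled as simple per-device reports; the structure below is a minimal sketch for illustration, with assumed field names.

from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Report:
    bs_id: str       # base station identifier, unique within the system
    wld_id: str      # wireless location device identifier
    rssi_dbm: float  # measured signal strength, e.g. an RSSI reading

def group_by_device(reports):
    """Step 315: collect all base station reports per device, ready for the
    position estimation of step 320."""
    by_device = defaultdict(list)
    for report in reports:
        by_device[report.wld_id].append(report)
    return by_device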

The location server (155) then determines further location information associated with the wireless location devices (110a, 110b) which corresponds to one or more of the coverage areas (120x-120z) at step 320. For each wireless location device (110a), the location server (155) may identify a signal strength measurement and a corresponding base station location from the base station identifier, and estimate the position of the device (110a) using trilateration, triangulation or any other suitable locating method as would be appreciated by those skilled in the art. The estimated position will typically correspond to the predetermined location of one of the coverage areas; in other words, the estimated position is within one of the coverage areas. The location server then timestamps the determined location information (in this example the estimated position) at step 325. This step may be implemented using timestamp signals received from a time-stamping function (180) also used by the image server (150); however, an internal clock will typically be adequate. The location server (155) then stores the determined location information at step 330. Whilst the determined location information has been described in this embodiment as an estimated position, or base station locations together with wireless location device signal strengths, the location information could simply be an identifier for the coverage area corresponding to the estimated position of the wireless location device.

Referring now to FIG. 4, a method of surveillance of an object is shown. This method 400 may be implemented by the tracking server 160, location server 155, and image server 150 of FIG. 1; however, it should be understood that the method is not limited to being performed on these apparatus. Additionally, whilst the method indicates a particular order of steps, in other embodiments the steps could be ordered differently, and/or the steps could be modified or removed. The tracking server (160) initially receives a surveillance request at step 405. The surveillance request includes an object identifier and a surveillance time window. The object identifier corresponds to or is the same as one of the identifiers of the wireless location devices (110a, 110b) which uniquely identifies each wireless location device within the system. The surveillance time window is simply a duration having a start time and an end time over which the identified or selected object is to be tracked. The tracking server (160) then determines location information associated with the wireless location device (110a) associated with the identified object (105a) over the surveillance time window by requesting this location information from the location server (155) at step 410. The location server (155) receives this request from the tracking server (160), and returns the location information for the identified wireless location device (110a) over the requested times to the tracking server at step 415. In the example embodiment, this location information, as determined by the method (300) of FIG. 3, is the estimated location of the wireless location device (110a) for every second of the surveillance time window; however, other time intervals could alternatively be used.

The tracking server (160) receives this location information and determines which coverage area (120x-120z) each item of location information corresponds to at each time interval at step 420. The correspondence between the location information and the coverage areas is available using the predetermined locations of the coverage areas (120x-120z). The tracking server (160) then requests images from the image server (150) which correspond to the determined coverage areas and respective time intervals at step 425. The requested times correspond to the timestamps used by the location server 155, and also, in some embodiments, by the image server (150). The image server (150) receives these requested coverage areas and respective time intervals from the tracking server (160) and returns the corresponding recorded and stored images at step 430. The image server may implement this step by matching the requested coverage areas with respective camera identifiers and searching for image files having these camera identifiers and the requested time intervals. The tracking server (160) retrieves these images from the image server (150) at step 435. The retrieved image files are recorded images of the coverage areas corresponding to the location information of the identified object at each time interval over the surveillance time window. The tracking server (160) may arrange the received images into chronological order at step 440, for example using the timestamps associated with each image. The images of the coverage areas (120x-120z) traversed by the object (105a) are then displayed on the display screen by the tracking server at step 445. Thus the object (105a) can be tracked over the surveillance time window by viewing the images of the coverage areas showing the object. For example, a lost child can be tracked or viewed as he or she moves around an amusement park to determine whether the child has simply become lost or been abducted.
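
Putting the steps of method 400 together, and reusing the hypothetical location_store and image_store from the sketches above, the retrieval might be expressed as follows; the area-to-camera mapping and the timestamp-matching tolerance are assumptions.

def retrieve_surveillance_images(wld_id, t_start, t_end, tolerance_s=1.0):
    """Steps 410-440: find the object's timestamped locations within the
    surveillance time window, map them to cameras, and return the matching
    stored images in chronological order."""
    area_to_cam = {"120x": "115x", "120y": "115y", "120z": "115z"}  # assumed
    track = [loc for loc in location_store
             if loc["wld_id"] == wld_id and t_start <= loc["timestamp"] <= t_end]
    matched, seen = [], set()
    for loc in track:
        cam_id = area_to_cam.get(loc["area"])
        for index, image in enumerate(image_store):
            if (index not in seen and image["cam_id"] == cam_id and
                    abs(image["timestamp"] - loc["timestamp"]) <= tolerance_s):
                seen.add(index)
                matched.append(image)
    return sorted(matched, key=lambda image: image["timestamp"])  # step 440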

The tracking server 160 may additionally be arranged to display images from the coverage area in which an object is currently located. This may be implemented by interrogating the location server on the latest location information for the identified object and wireless location device, and requesting images from the image server of the coverage area corresponding to that location information. Indeed a direct feed from the camera 115x-115z associated with the coverage area may be displayed on the screen 165.

Referring now to FIG. 5A, a method of triggering the image server to start recording images in response to a predetermined event is shown. This method 500 may be implemented by the tracking server 160 and the image server 150 of FIG. 1; however, it should be understood that the method is not limited to being performed on these apparatus. Additionally, whilst the method indicates a particular order of steps, in other embodiments the steps could be ordered differently, and/or the steps could be modified or removed. The tracking server (160) monitors location information associated with an object at step 505. This step may be implemented by periodically requesting the latest location information for a wireless location device (110a) associated with an identified object (105a). This step may also involve the processing of the location information, for example to calculate an estimated position for the object if this has not already been done, and/or to determine the coverage area in which the object is currently located. The tracking server (160) then determines whether there has been a change in location of the object at step 510. This predetermined event, the change in object location, may correspond to a change in coverage area, as determined from the latest location information. If no change in location has been detected (510N), then the tracking server returns to monitoring the location information of the object. However, if a location change is detected (510Y), then the tracking server instructs the image server (150) to start storing images at step 515. The instruction to start storing images may relate only to the coverage area to which the object has moved, or it may relate to all coverage areas. The instruction is received by the image server (150) at step 205 of method 200 illustrated in FIG. 2. The image server (150) then proceeds to implement the rest of method 200.
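
A minimal sketch of this trigger loop follows, where current_area and start_storing stand in for the location server query (step 505) and the instruction to the image server (step 515); both are assumed interfaces, not the patent's.

import time

def watch_for_location_change(wld_id, current_area, start_storing, poll_s=1.0):
    """Method 500: poll the latest location information and, on a change of
    coverage area (step 510Y), trigger image storage (step 515)."""
    last_area = current_area(wld_id)
    while True:
        time.sleep(poll_s)
        area = current_area(wld_id)
        if area != last_area:
            start_storing(area)
            last_area = area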

Whilst the embodiment has been described with respect to one object, it may be implemented with respect to many such objects, so that whenever the predetermined location of one of these objects changes, the storing of images is triggered.

In an example implementation, a notebook computer (105b) may have a normal or predetermined location which may or may not be within one of the coverage areas (120x-120z). When the notebook is removed from this predetermined location, the system (100) is configured by methods 500 and 200 to start recording images of the coverage areas in order to enable tracking of the notebook computer. Thus images of the notebook computer (105b) may be used to determine whether the notebook computer was legitimately moved by an authorized person, or has been stolen. If the notebook computer has been stolen, then the thief may be tracked through the coverage areas, and perhaps their identity determined manually or by the public release of suitable images of the thief.

Another method of triggering the image server to start recording images in response to a predetermined event is shown in FIG. 5B. This method 550 may be implemented by the tracking server 160 and the image server 150 of FIG. 1; however, it should be understood that the method is not limited to being performed on these apparatus. Additionally, whilst the method indicates a particular order of steps, in other embodiments the steps could be ordered differently, and/or the steps could be modified or removed. The tracking server (160) monitors location information associated with an object at step 555. This step is similar to step 505 from method 500, and may be implemented by periodically requesting the latest location information for a wireless location device (110a) associated with an identified object (105a). This step may also involve the processing of the location information, for example to calculate an estimated position for the object if this has not already been done, and/or to determine the coverage area in which the object is currently located. The tracking server (160) then determines whether location information has been determined at step 560. This predetermined event, the start of location information, may correspond to a wireless location device (110a) moving into range of a base station (125a-125c) or moving into a coverage area. It may be that the location server (155) has no location information on the identified object (and hence wireless location device 110a) until this time, or that the location of the object does not correspond with a coverage area (120x-120z) until this time. If no location information is received on the wireless location device (and hence object 105a), or the location information does not correspond with a coverage area (560N), then the tracking server returns to monitoring the location information of the object. However, if location information (or its correspondence with a coverage area) is detected (560Y), then the tracking server instructs the image server (150) to start storing images at step 565. As with step 515 from method 500, the instruction to start storing images may relate only to the coverage area into which the object has moved, or it may relate to all coverage areas. The instruction is received by the image server (150) at step 205 of method 200 illustrated in FIG. 2. The image server (150) then proceeds to implement the rest of method 200.
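
The FIG. 5B variant differs from the previous sketch only in its predicate: it fires the first time the device's location information corresponds with a coverage area. A sketch under the same assumed interfaces:

import time

def watch_for_first_detection(wld_id, current_area, start_storing, poll_s=1.0):
    """Steps 555-565: poll until the device is located within some coverage
    area (560Y), then trigger image storage (step 565)."""
    while True:
        area = current_area(wld_id)
        if area is not None:
            start_storing(area)
            return
        time.sleep(poll_s)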

Whilst the embodiment has been described with respect to one object, it may be implemented with respect to many such objects, so that whenever one of the objects is detected within one of the coverage areas, the storing of images is triggered. Alternatively, storing of images of each of the coverage areas may be triggered independently by the detection of one of a number of objects within the respective coverage area. Such an arrangement reduces the storage space required for the image files, as only images of one or more predetermined objects are stored. Furthermore, in addition or alternatively, the cameras of the respective coverage areas may be arranged to start recording in response to the trigger instructions from the tracking server. In a further arrangement, recording and/or storing of images for a coverage area may be stopped when no objects are detected within the coverage area.

In an example implementation, a person (105a) such as a child in an amusement arcade may receive an RFID tag on a wrist-band when entering. The storing of images from a particular camera may then be triggered upon detection of the child within a corresponding coverage area. In other words, location information associated with the RFID tag (110a) and recorded in the location server (155) is monitored to determine when it corresponds with a coverage area (120x). The image server (150) is then instructed to store images received from the camera (115x) corresponding to the coverage area (120x) which the child (105a) has just entered. Storing of images of the child in different coverage areas may then be triggered as the child enters these areas. Similarly storing of images from other coverage areas may also be triggered when different children enter them. Thus even though there is not continuous image recording of all coverage areas, there is continuous image recording of all objects.

Referring now to FIG. 6, a more detailed schematic of an example tracking server is shown. This tracking server 160 may implement the method 400 of FIG. 4, the method 500 of FIG. 5A, and/or the method 550 of FIG. 5B. However, it should be understood that the tracking server is not limited to performing these methods and may perform surveillance and/or image recording triggering methods according to other embodiments. Additionally, whilst the tracking server 160 of FIG. 6 indicates a particular arrangement of component parts, in other embodiments the component parts could be arranged differently, and/or the component parts could be modified or replaced. The tracking server 160 comprises a processor 605, a working memory 610 such as RAM, storage memory 615 such as a hard-disk drive, and a user interface 620 for coupling to user interface devices. These user interface devices include a keyboard 625, from which the user interface 620 receives instructions and/or data from a user of the tracking server 160, as well as a display screen 165, to which the user interface 620 forwards instructions, images and/or data for the user. Thus a user may enter a surveillance time window and an object identifier using the keyboard, and view images of the identified object over the surveillance time window on the display screen. The tracking server 160 may be implemented on a suitably configured personal computer (PC), for example, or on other arrangements of computing apparatus.

The skilled person will recognise that the above-described apparatus and methods may be embodied as processor control code, for example on a carrier medium such as a disk, CD- or DVD-ROM, programmed memory such as read only memory (Firmware), or on a data carrier such as an optical or electrical signal carrier. For some applications embodiments of the invention may be implemented on a DSP (Digital Signal Processor), ASIC (Application Specific Integrated Circuit) or FPGA (Field Programmable Gate Array). Thus the code may comprise conventional programme code or microcode or, for example code for setting up or controlling an ASIC or FPGA. The code may also comprise code for dynamically configuring re-configurable apparatus such as re-programmable logic gate arrays. Similarly the code may comprise code for a hardware description language such as Verilog™ or VHDL (Very high speed integrated circuit Hardware Description Language). As the skilled person will appreciate, the code may be distributed between a plurality of coupled components in communication with one another. Where appropriate, the embodiments may also be implemented using code running on a field-(re)programmable analogue array or similar device in order to configure analogue hardware.

The skilled person will also appreciate that the various embodiments and specific features described with respect to them could be freely combined with the other embodiments or their specifically described features in general accordance with the above teaching. The skilled person will also recognise that various alterations and modifications can be made to specific examples described without departing from the scope of the appended claims.

Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.

Inventors: Ganesan, Karthikeyan; Variyath, Girish S.; Jayaraman, Vikram

Assignments
Jun 06 2007: Variyath, Girish S. to Cisco Technology, Inc (assignment of assignors interest; Reel/Frame 019418/0046)
Jun 06 2007: Ganesan, Karthikeyan to Cisco Technology, Inc (assignment of assignors interest; Reel/Frame 019418/0046)
Jun 06 2007: Jayaraman, Vikram to Cisco Technology, Inc (assignment of assignors interest; Reel/Frame 019418/0046)
Jun 08 2007: Cisco Technology, Inc. (assignment on the face of the patent)