A method for facilitating collection and recall of buried asset data on a mobile computing device communicatively coupled with a communications network is provided. The mobile computing device transmits current position data of the mobile computing device to a server via the network, and receives from the server, a) buried asset data including depth data and electrical signal data for one or more buried assets, wherein the buried asset data corresponds to the current position data, and b) a stored image of a physical area corresponding to the current position data, wherein the stored image comprises markings indicating a position of the buried assets. The mobile computing device further displays the buried asset data from the server, and reads a current image of the physical area from an optical sensor of the mobile computing device. The mobile computing device further displays the stored image overlaid onto the current image, and determines that the stored image is aligned within a margin of error with the current image.
8. A method for facilitating collection and recall of buried asset data on a mobile computing device communicatively coupled with a communications network, comprising:
transmitting current position data of the mobile computing device to a server via the communications network;
receiving, from the server via the communications network, a) buried asset data including depth data and electrical signal data for one or more buried assets, wherein the buried asset data corresponds to the current position data, and b) a stored image of a physical area corresponding to the current position data, wherein the stored image comprises one or more markings indicating a position of the one or more buried assets;
displaying the buried asset data from the server;
reading a current image of the physical area from an optical sensor of the mobile computing device;
displaying the stored image simultaneously with the current image, wherein the stored image is overlaid onto the current image;
determining that the stored image is aligned within a margin of error to the current image;
receiving current buried asset data from one or more sensors communicatively coupled with the mobile computing device;
generating a data structure and auto-populating the data structure with the current buried asset data; and
transmitting the data structure and the current image to the server via the communications network.
15. A method for facilitating collection and recall of buried asset data on a mobile computing device, comprising:
determining current position data of the mobile computing device;
accessing a) buried asset data including depth data and electrical signal data for one or more buried assets, wherein the buried asset data corresponds to the current position data, and b) a stored image of a physical area corresponding to the current position data, wherein the stored image comprises one or more markings indicating a position of the one or more buried assets;
displaying the buried asset data;
reading a current image of the physical area from an optical sensor of the mobile computing device;
displaying the stored image simultaneously with the current image, wherein the stored image is overlaid onto the current image;
determining that the stored image is aligned within a margin of error to the current image;
responsive to determining that the stored image is aligned within a margin of error with the current image, receiving current buried asset data from one or more sensors communicatively coupled with the mobile computing device;
reading a second current image of the physical area from the optical sensor of the mobile computing device, wherein the second current image comprises one or more markings indicating a position of the one or more buried assets;
generating a data structure and auto-populating the data structure with the current buried asset data; and
storing the data structure and the second current image.
1. A method for facilitating collection and recall of buried asset data on a mobile computing device communicatively coupled with a communications network, comprising:
transmitting current position data of the mobile computing device to a server via the communications network;
receiving, from the server via the communications network, a) buried asset data including depth data and electrical signal data for one or more buried assets, wherein the buried asset data corresponds to the current position data, and b) a stored image of a physical area corresponding to the current position data, wherein the stored image comprises one or more markings indicating a position of the one or more buried assets;
displaying the buried asset data from the server;
reading a current image of the physical area from an optical sensor of the mobile computing device;
displaying the stored image simultaneously with the current image, wherein the stored image is overlaid onto the current image;
determining that the stored image is aligned within a margin of error to the current image;
receiving current buried asset data from one or more sensors communicatively coupled with the mobile computing device;
reading a second current image of the physical area from the optical sensor of the mobile computing device, wherein the second current image comprises one or more markings indicating a position of the one or more buried assets;
generating a data structure and auto-populating the data structure with the current buried asset data; and
transmitting the data structure and the second current image to the server via the communications network.
2. The method of claim 1, further comprising:
receiving a radio frequency signal comprising a time the signal was transmitted and a location of a sender of the signal;
calculating current position data of the mobile computing device based on the signal; and
transmitting the current position data to a server via the communications network.
3. The method of claim 1, further comprising:
receiving, from the server via the communications network, a plurality of records, wherein each record includes: a) buried asset data including depth data and electrical signal data for one or more buried assets, wherein the buried asset data corresponds to the current position data, and b) a stored image of a physical area corresponding to the current position data, wherein the stored image comprises one or more markings indicating a position of the one or more buried assets;
reading a selection of one of the plurality of records from a user; and
transmitting the selection to the server via the communications network.
4. The method of claim 1, further comprising:
receiving, from the server via the communications network, a plurality of images of a physical area corresponding to the current position data;
reading a selection of one of the plurality of images from a user; and
transmitting the selection to the server via the communications network.
5. The method of claim 1, further comprising:
reading a command from a user indicating that the stored image is aligned within a margin of error with the current image.
6. The method of claim 1, further comprising:
executing an image processing routine for determining that the stored image is aligned within a margin of error with the current image.
7. The method of claim 1, further comprising:
comparing the buried asset data received from the server with the current buried asset data;
if the buried asset data received from the server is substantially equal to the current buried asset data, executing the steps of generating a data structure and auto-populating the data structure with the current buried asset data, and transmitting the data structure and the second current image to the server via the communications network.
9. The method of claim 8, further comprising:
receiving a radio frequency signal comprising a time the signal was transmitted and a location of a sender of the signal;
calculating current position data of the mobile computing device based on the signal; and
transmitting the current position data to a server via the communications network.
10. The method of claim 8, further comprising:
receiving, from the server via the communications network, a plurality of records, wherein each record includes: a) buried asset data including depth data and electrical signal data for one or more buried assets, wherein the buried asset data corresponds to the current position data, and b) a stored image of a physical area corresponding to the current position data, wherein the stored image comprises one or more markings indicating a position of the one or more buried assets;
reading a selection of one of the plurality of records from a user; and
transmitting the selection to the server via the communications network.
11. The method of claim 8, further comprising:
receiving, from the server via the communications network, a plurality of images of a physical area corresponding to the current position data;
reading a selection of one of the plurality of images from a user; and
transmitting the selection to the server via the communications network.
12. The method of claim 8, further comprising:
reading a command from a user indicating that the stored image is aligned within a margin of error with the current image.
13. The method of claim 8, further comprising:
executing an image processing routine for determining that the stored image is aligned within a margin of error with the current image.
14. The method of claim 8, further comprising:
comparing the buried asset data received from the server with the current buried asset data;
if the buried asset data received from the server is substantially equal to the current buried asset data, executing the steps of generating a data structure and auto-populating the data structure with the current buried asset data, and transmitting the data structure and the second current image to the server via the communications network.
16. The method of claim 15, further comprising:
receiving a radio frequency signal comprising a time the signal was transmitted and a location of a sender of the signal;
calculating current position data of the mobile computing device based on the signal.
17. The method of claim 15, further comprising:
accessing a plurality of records, wherein each record includes: a) buried asset data including depth data and electrical signal data for one or more buried assets, wherein the buried asset data corresponds to the current position data, and b) a stored image of a physical area corresponding to the current position data, wherein the stored image comprises one or more markings indicating a position of the one or more buried assets;
reading a selection of one of the plurality of records from a user; and
accessing the record corresponding to the selection from the user.
Not Applicable.
The technical field relates generally to the identification of buried assets (i.e., underground utility lines) and, more specifically, to processes for utilizing buried asset data over telecommunications networks.
Utility lines, such as lines for telephones, electricity distribution, natural gas, cable television, fiber optics, Internet, traffic lights, street lights, storm drains, water mains, and wastewater pipes, are often located underground. Utility lines are referred to as “buried assets” herein. Consequently, before excavation occurs in an area, especially an urban area, an excavator is typically required to clear excavation activities with the proper authorities. The clearance procedure usually includes contacting a central authority that in turn notifies the appropriate utility companies. Subsequently, each utility company must perform a buried asset detection procedure, which includes visiting the excavation site, detecting the relevant buried assets and physically marking the position of the buried asset using temporary paint or flags. Upon completion of this procedure by the appropriate utility companies, excavation can occur with the security that buried assets will not be damaged.
One of the problems that arise during buried asset detection is the amount of time spent detecting the buried asset. Usually, a technician visiting a proposed excavation site is not provided with any position data, or only vague position data as a starting point. Consequently, the technician must perform time-consuming sensing procedures on a large swath of land to detect the buried asset. This increases the time and resources necessary to detect the buried asset. Another problem with conventional buried asset detection is the method by which buried asset data is conveyed. Typically, a technician marks the positions of buried assets using temporary paint and/or flags at the proposed excavation site. These surface markings, however, were designed to remain only for a short period of time. Consequently, after the surface markings have been erased or removed, buried asset detection must be performed again, if the need arises in the future. This is wasteful and redundant.
Therefore, a need exists for improvements over the prior art, and more particularly for methods and systems that expedite the buried asset detection process for excavation sites, while reducing waste and redundancy.
A method and system that facilitates collection and recall of buried asset data on a mobile computing device communicatively coupled with a communications network is provided. This Summary is provided to introduce a selection of disclosed concepts in a simplified form that are further described below in the Detailed Description including the drawings provided. This Summary is not intended to identify key features or essential features of the claimed subject matter. Nor is this Summary intended to be used to limit the claimed subject matter's scope.
In one embodiment, a method for facilitating collection and recall of buried asset data on a mobile computing device communicatively coupled with a communications network is provided that solves the above-described problems. The mobile computing device transmits current position data of the mobile computing device to a server via the communications network, and receives, from the server via the communications network, a) buried asset data including depth data and electrical signal data for one or more buried assets, wherein the buried asset data corresponds to the current position data, and b) a stored image of a physical area corresponding to the current position data, wherein the stored image comprises one or more markings indicating a position of the one or more buried assets. The mobile computing device further displays the buried asset data from the server, and reads a current image of the physical area from an optical sensor of the mobile computing device. The mobile computing device further displays the stored image overlaid onto the current image, and determines that the stored image is aligned within a margin of error with the current image.
The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate various example embodiments. In the drawings:
The following detailed description refers to the accompanying drawings. Wherever possible, the same reference numbers are used in the drawings and the following description to refer to the same or similar elements. While embodiments of the invention may be described, modifications, adaptations, and other implementations are possible. For example, substitutions, additions, or modifications may be made to the elements illustrated in the drawings, and the methods described herein may be modified by substituting, reordering, or adding stages to the disclosed methods. Accordingly, the following detailed description does not limit the invention. Instead, the proper scope of the invention is defined by the appended claims.
The present invention improves over the prior art by providing systems and methods that allow for the re-use of prior buried asset detection activities. The present invention allows technicians performing standard buried asset detection procedures on a proposed excavation site to view prior buried asset data about the same site on a mobile computing device. This data guides the technician to the position of buried assets, thereby reducing the amount of time and resources spent detecting the buried assets. The present invention further utilizes an image comparison method whereby a current image of the proposed excavation site is compared with a stored image of the proposed excavation site, wherein the stored image includes markings and objects that indicate the position of buried assets. This image comparison method further aids the technician in performing buried asset detection procedures, which further reduces time spent detecting the buried asset. Lastly, the present invention allows the technician to upload the current buried asset data he has garnered at the proposed excavation site, which may be used in future buried asset detection activities at the same site.
A locator device 112 may be a conventional, off-the-shelf, utility locator that detects and identifies buried assets using radio frequency and/or magnetic sensors. As such, locator device 112 and mobile computing device 120 may each comprise a computing device 600, described below in greater detail with respect to
Server 102 includes a software engine that delivers applications, data, program code and other information to networked devices 120. The software engine of server 102 may perform other processes such as transferring multimedia data in a stream of packets that are interpreted and rendered by a software application as the packets arrive. It should be noted that although
Server 102, mobile computing device 120 and locator device 112 may each include program logic comprising computer source code, scripting language code or interpreted language code that perform various functions of the present invention. In one embodiment, the aforementioned program logic may comprise program module 607 in
Environment 100 may be used when a mobile computing device 120 and locator device 112 engage in buried asset detection activities that comprise storing and reading buried asset data to and from database 104 coupled to server 102. Various types of data may be stored in the database 104 of server 102. For example, the database 104 may store one or more records for each location, i.e., a location record. A location record may include location data, such as latitude and longitude coordinates, an altitude coordinate, a current time, a textual map address, or the like. A location record may also include a list of buried asset data, wherein each buried asset item may be defined by its depth, position, electrical signal measurement (such as current, resistance, impedance, magnitude, frequency, etc.) and orientation.
A location record may further include one or more images (i.e., photographs) of the physical area of the location. In one embodiment, an image of a physical area corresponding to a location comprises one or more surface markings indicating a position of the one or more buried assets. Markings may include colored or uncolored solid lines, dotted lines, circles, squares, flags, arrows, objects, text or other visual indicia in the image that indicate the actual location of a buried asset. A solid yellow line, for example, may be used in an image of a physical area corresponding to a location in order to indicate the presence of a buried asset in the actual location of the solid yellow line. Lastly, a location record may include other data, such as the name or unique identifier for the technician that created the location record, a time/date stamp indicating a creation and/or modification date of the location record, etc.
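For illustration only (the disclosure does not prescribe any programming language or schema), a location record of the kind described above could be modeled as a simple data structure. The class and field names below are hypothetical assumptions, chosen to mirror the fields enumerated in the two preceding paragraphs:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class BuriedAssetItem:
    # Each buried asset item is defined by its depth, position,
    # electrical signal measurement, and orientation.
    depth_m: float
    position: tuple              # (latitude, longitude)
    signal: dict                 # e.g. {"frequency_hz": ..., "current_a": ...}
    orientation_deg: float

@dataclass
class LocationRecord:
    latitude: float
    longitude: float
    altitude_m: Optional[float] = None
    map_address: str = ""
    assets: List[BuriedAssetItem] = field(default_factory=list)
    image_paths: List[str] = field(default_factory=list)  # marked-up site photos
    technician_id: str = ""
    created: str = ""            # time/date stamp

# Hypothetical example record for one site
record = LocationRecord(latitude=40.7128, longitude=-74.0060,
                        map_address="123 Main St")
record.assets.append(BuriedAssetItem(depth_m=1.2,
                                     position=(40.7128, -74.0060),
                                     signal={"frequency_hz": 512.0},
                                     orientation_deg=90.0))
```

A real implementation would likely map such a structure onto rows in database 104; the sketch only shows the grouping of fields into one record per location.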
Note that although server 102 is shown as a single and independent entity, in one embodiment of the present invention, the functions of server 102 may be integrated with another entity, such as the mobile computing device 120 and the locator device 112. Further, server 102 and its functionality, according to a preferred embodiment of the present invention, can be realized in a centralized fashion in one computer system or in a distributed fashion wherein different elements are spread across several interconnected computer systems.
Method 300 starts with step 302 wherein the device 120 calculates its current position (e.g., current position data 202) and transmits it to the server 102. In one embodiment, the device 120 calculates its current position using a Global Positioning System (GPS) receiver, which is a navigation device that receives GPS signals for the purpose of determining the device's current position on Earth. A GPS receiver, and its accompanying processor, may calculate latitude, longitude and altitude information. In this embodiment, step 302 comprises receiving a radio frequency signal from a GPS transmitter (such as a satellite) comprising a time the signal was transmitted and a position of the transmitter, calculating current position data 202 of the device 120 based on the signal, and transmitting the current position data 202 to the server 102 via the communications network 106. In another embodiment, the device 120 calculates its current position using alternative services, such as control-plane locating, GSM localization, dead reckoning, or any combination of the aforementioned position services. In yet another embodiment, the device 120 also calculates its current compass heading (such as via the use of a compass application) and transmits this data to the server 102.
Next, in step 304 the server 102 receives the current position data 202 (and any other data transmitted by device 120) and accesses any location records in its database 104 that correspond to the current position data 202 or within a margin of error of the current position data 202. If any such location records are found, in step 306 the server 102 transmits the one or more location records to the device 120 over the network 106. As explained above, each location record may include position data 212, buried data 214 and stored image data 216.
In step 308, the device 120 receives the location records. If device 120 receives only one location record, then the data in the location record is displayed for the technician 110 on a display of device 120 via, for example, one or more graphical user interfaces.
In one alternative to step 308, if device 120 receives multiple location records, then an abbreviated or truncated version of each location record is displayed for the technician 110 on a display of device 120 via, for example, one or more graphical user interfaces, so as to allow the technician to select one location record. For example, device 120 may display a list of stored images for each of the multiple location records, thereby allowing the technician to select one of the location records by clicking on one of the stored images. In another example, device 120 may display a drop down menu that displays a list of time/date stamps for each of the multiple location records, thereby allowing the technician to select one of the location records by clicking on one of the drop down menu items.
Next, in step 310 the device 120 reads a live, current or captured image of the physical area from an optical sensor of the device 120, such as a camera. In step 312, the device 120 displays the captured image simultaneously with the stored image 414, described above. Step 312 may be executed in a variety of ways. In one embodiment of step 312,
In one alternative, the device 120 may place arrows on the captured image 512 indicating to the technician how the optical sensor or camera of device 120 should be panned so that stored image 414 would be aligned with captured image 512. Using
In another embodiment of step 312,
Returning to method 300, in step 314, the device 120 determines that the stored image 414 has been aligned, within a margin of error, with the captured image 512. The purpose of step 314 is to aid the technician 110 in finding the position of buried assets at his current location. When the stored image 414 has been aligned with the captured image 512, the technician 110 is on notice that his device is looking, i.e., pointing at, the exact same place as shown in the stored image 414. Thus, step 314 ensures: a) that the position of the device 120 is the same, or nearly the same as, the position of the location record received by device 120 and b) that the physical area shown in captured image 512 is the same, or nearly the same as, the physical area shown in stored image 414. Step 314 may be executed in a variety of ways.
In one embodiment of step 314, the technician 110 pans the optical sensor or camera of device 120 so that the stored image 414 becomes aligned, within a margin of error, to the captured image 512. Subsequently, the technician 110 enters a command or presses a graphical user interface widget to indicate to device 120 that the stored image 414 has been aligned with the captured image 512. In another embodiment of step 314, the technician 110 pans the optical sensor or camera of device 120 until the device 120, using known image processing techniques (such as object recognition or pattern recognition), determines that the stored image 414 has been aligned, within a margin of error, to the captured image 512. In this embodiment, the device 120 may use known image processing techniques to indicate to the technician 110 in which direction to pan the camera. Subsequently, the device 120 stores an indicator and/or displays a message to technician 110 indicating that the stored image 414 has been aligned with the captured image 512.
Next, in step 316, the device 112 generates current buried asset data 204 based on data received from one or more sensors of the device 112, and transmits the buried asset data to the device 120. In one embodiment, the device 120 compares the current buried asset data 204 with the buried asset data 214 received from the server 102. In this embodiment, the control flow proceeds to steps 318 through 324 only if the current buried asset data 204 is identical to, or nearly identical to (i.e., substantially equal or within a margin of error) the buried asset data 214.
In step 318, the technician 110 may utilize the data received from server 102 (such as 212, 214, 216), as well as the current data generated by devices 112 and 120 (such as 202, 204, 206), to place temporary physical markings at his current location, typically using paint and/or flags, to indicate the location of any buried assets. In step 320, the technician uses the device 120 to capture a new image or photograph of the current location, wherein the image includes the surface markings that have been placed by the technician. This new captured image is referred to as captured image data 206.
In step 322, the device 120 generates a data structure and auto-populates the data structure with the current buried asset data 204. A data structure may be an array, a record, a hash, a union, a set, an object or the like. In step 324, the device 120 transmits the data structure including the current buried asset data 204, and the captured image data 206, to the server 102 via the communications network 106. In one embodiment, the data transmitted to server 102 in step 324 may include other data, such as the name or unique identifier for the technician that created the data, a time/date stamp indicating a creation and/or modification date of the buried asset data 204 and/or captured image data 206, etc.
In step 326, the server 102 receives the data 204, 206 and in response, generates and stores, in database 104, one or more location records, in which is stored the current buried asset data 204, the captured image data 206, and the position data 202 previously provided by device 120 in step 302. In one alternative, step 326, the server 102 receives the data 204, 206 and in response, updates the one or more location records (in database 104) previously accessed by server 102 in step 304.
Note that the location records generated and stored in step 326 are stored in association with the one or more location records accessed by server 102 in step 304, since the location records generated and stored in step 326 correspond to the same position data 202 as the location records accessed by server 102 in step 304. Subsequently, when the server 102 seeks location records associated with position data 202 in the future, server 102 will access the location records generated and stored in step 326.
In an alternative embodiment, the database 104 and the functionality of server 102 are both integrated into the device 120. In this alternative embodiment, the method 300 is executed as defined below.
In step 302, the device 120 calculates its current position and in step 304 the device 120 accesses any location records in its database that correspond to the current position data 202 or within a margin of error of the current position data 202. In this embodiment, step 306 is not executed. In step 308, the device 120 displays the data in the location record for the technician 110, such as in interfaces 402 and 404.
Next, in step 310 the device 120 reads a current or captured image of the physical area from an optical sensor of the device 120, such as a camera. In step 312, the device 120 displays the captured image simultaneously with the stored image 414, as described above. In step 314, the device 120 determines that the stored image 414 has been aligned, within a margin of error, with the captured image 512. In step 316, the device 112 generates current buried asset data 204 based on data received from one or more sensors of the device 112, and transmits the buried asset data to the device 120. In step 318, the technician 110 may utilize the data that was accessed in step 304 above, as well as the current data generated by devices 112 and 120 (such as 202, 204, 206), to place temporary physical markings at his current location. In step 320, the technician uses the device 120 to capture image data 206 of the current location, wherein the image includes the markings that have been placed by the technician.
In step 322, the device 120 generates a data structure and auto-populates the data structure with the current buried asset data 204. In this embodiment, step 324 is not executed. In step 326, the device 120 generates and stores, in its database, one or more location records, in which is stored the current buried asset data 204, the captured image data 206, and the position data 202 previously determined by device 120 in step 302.
With reference to
Computing device 600 may have additional features or functionality. For example, computing device 600 may also include additional data storage devices (removable and/or non-removable) such as, for example, magnetic disks, optical disks, or tape. Such additional storage is illustrated in
Computing device 600 may also contain a communication connection 616 that may allow device 600 to communicate with other computing devices 618, such as over a network in a distributed computing environment, for example, an intranet or the Internet. Communication connection 616 is one example of communication media. Communication media may typically be embodied by computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and includes any information delivery media. The term “modulated data signal” may describe a signal that has one or more characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media may include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, radio frequency (RF), infrared, and other wireless media. The term computer readable media as used herein may include both computer storage media and communication media.
As stated above, a number of program modules and data files may be stored in system memory 604, including operating system 605. While executing on processing unit 602, programming modules 606 (e.g. program module 607) may perform processes including, for example, one or more of method 300's stages as described above. The aforementioned processes are examples, and processing unit 602 may perform other processes. Other programming modules that may be used in accordance with embodiments of the present invention may include electronic mail and contacts applications, word processing applications, spreadsheet applications, database applications, slide presentation applications, drawing or computer-aided application programs, etc.
Generally, consistent with embodiments of the invention, program modules may include routines, programs, components, data structures, and other types of structures that may perform particular tasks or that may implement particular abstract data types. Moreover, embodiments of the invention may be practiced with other computer system configurations, including hand-held devices, multiprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers, and the like. Embodiments of the invention may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices.
Furthermore, embodiments of the invention may be practiced in an electrical circuit comprising discrete electronic elements, packaged or integrated electronic chips containing logic gates, a circuit utilizing a microprocessor, or on a single chip (such as a System on Chip) containing electronic elements or microprocessors. Embodiments of the invention may also be practiced using other technologies capable of performing logical operations such as, for example, AND, OR, and NOT, including but not limited to mechanical, optical, fluidic, and quantum technologies. In addition, embodiments of the invention may be practiced within a general purpose computer or in any other circuits or systems.
Embodiments of the present invention are described above with reference to block diagrams and/or operational illustrations of methods, systems, and computer program products according to embodiments of the invention. The functions/acts noted in the blocks may occur out of the order shown in any flowchart. For example, two blocks shown in succession may in fact be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality/acts involved.
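The point about flowchart ordering can be made concrete: two independent stages, such as reading the current camera image and fetching stored asset data, may run in either order or concurrently without changing the result. The stage names below are hypothetical, chosen only to mirror the method described above.

```python
from concurrent.futures import ThreadPoolExecutor

# Two independent flowchart stages (hypothetical names): because neither
# depends on the other's output, they may execute concurrently.
def read_current_image():
    return "current_image"

def fetch_stored_data():
    return "stored_data"

with ThreadPoolExecutor() as pool:
    f1 = pool.submit(read_current_image)
    f2 = pool.submit(fetch_stored_data)
    results = (f1.result(), f2.result())
```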
While certain embodiments of the invention have been described, other embodiments may exist. Furthermore, although embodiments of the present invention have been described as being associated with data stored in memory and other storage mediums, data can also be stored on or read from other types of computer-readable media, such as secondary storage devices (for example, hard disks, floppy disks, or CD-ROMs) or other forms of RAM or ROM. Further, the disclosed methods' stages may be modified in any manner, including by reordering stages and/or inserting or deleting stages, without departing from the invention.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.
| Patent | Priority | Assignee | Title |
|---|---|---|---|
| 11528401 | Jul 13 2012 | SEESCAN, INC | Pipe inspection systems with self-grounding portable camera controllers |
| 11639661 | Aug 12 2019 | THE CHARLES MACHINE WORKS, INC | Augmented reality system for use in horizontal directional drilling operations |
| 11803863 | Dec 13 2021 | CROP SEARCH LLC | System and method for food safety audit and sampling |
| 9235823 | Nov 05 2012 | Berntsen International, Inc. | Underground asset management system |
| ER6522 | | | |
| Patent | Priority | Assignee | Title |
|---|---|---|---|
| 6094625 | Jul 03 1997 | Trimble Navigation Limited | Augmented vision for survey work and machine control |
| 6815411 | Nov 20 2000 | The Procter & Gamble Company | Fabric softening compositions and methods |
| 6975942 | Jun 14 2000 | Vermeer Manufacturing Company | Underground utility detection system and method |
| 7319387 | Mar 17 2004 | 3M Innovative Properties Company | GPS interface for locating device |
| 7482973 | Jul 20 2004 | PROSTAR GEOCORP, INC | Precision GPS driven utility asset management and utility damage prevention system and method |
| 7978125 | Sep 26 2008 | MBDA ITALIA S P A | Method for processing a radar echo signal |
| 7978129 | Mar 14 2006 | PROSTAR GEOCORP, INC | System and method for collecting and updating geographical data |
| 8155390 | Mar 18 2008 | Certusview Technologies, LLC | Methods and apparatus for providing unbuffered dig area indicators on aerial images to delimit planned excavation sites |
| 20020184235 | | | |
| 20090204614 | | | |
| 20090204625 | | | |
| 20120105440 | | | |
| Executed on | Assignor | Assignee | Conveyance | Reel/Frame |
|---|---|---|---|---|
| Apr 23 2015 | HADDY, ALAN | IPEG Corporation | ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS) | 035691/0149 |
| Sep 13 2021 | IPEG Corporation | UTTO INC | ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS) | 057511/0649 |
| Date | Maintenance Fee Events |
|---|---|
| Sep 02 2016 | REM: Maintenance Fee Reminder Mailed. |
| Dec 28 2016 | M2551: Payment of Maintenance Fee, 4th Yr, Small Entity. |
| Dec 28 2016 | M2554: Surcharge for late Payment, Small Entity. |
| Jul 22 2020 | M2552: Payment of Maintenance Fee, 8th Yr, Small Entity. |
| Sep 09 2024 | REM: Maintenance Fee Reminder Mailed. |
| Date | Maintenance Schedule |
|---|---|
| Jan 22 2016 | 4 years fee payment window open |
| Jul 22 2016 | 6 months grace period start (w surcharge) |
| Jan 22 2017 | patent expiry (for year 4) |
| Jan 22 2019 | 2 years to revive unintentionally abandoned end (for year 4) |
| Jan 22 2020 | 8 years fee payment window open |
| Jul 22 2020 | 6 months grace period start (w surcharge) |
| Jan 22 2021 | patent expiry (for year 8) |
| Jan 22 2023 | 2 years to revive unintentionally abandoned end (for year 8) |
| Jan 22 2024 | 12 years fee payment window open |
| Jul 22 2024 | 6 months grace period start (w surcharge) |
| Jan 22 2025 | patent expiry (for year 12) |
| Jan 22 2027 | 2 years to revive unintentionally abandoned end (for year 12) |