This invention provides a system and method that employs reduction in bandwidth and the amount of data stored, along with queuing of data, so that the significant and/or relevant shipboard visual information is detected and communicated continuously from the ship (or other remote vehicle/location) to shore without loss of useful/high-priority information and within the available bandwidth of that typical, available satellite link. The system and method supports remote configuration and management from shore to the ship over the same communications channel but in the reverse direction. The system and method further facilitates the logical summarization of continuous visibility into shipboard activities, shipboard behavior, and shipboard status so that a land/shore-based manager/reviewer/auditor can review, comprehend and synthesize such information, at a glance, using a land-based user interface/dashboard, related to events that have recently transpired on the sailing commercial merchant vessel without the need to review hours/days/weeks of multiple channels of video.
26. A method for storage and bandwidth reduction of visual events in association with an activity comprising the steps of:
acquiring images of the activity with at least one camera and performing visual event detection with a processor;
identifying the visual events from software-based and hardware-based event detectors that identify predetermined characteristics within one or more of the images and determine a state based upon the characteristics and storing the visual events in a queue based upon predetermined parameters, wherein the visual events comprise activities related to at least one of bridge operations and safety operations with respect to a sea-going vessel; and
communicating with a remote location over a reduced-bandwidth communications channel adapted to transmit the stored visual events in a reduced-bandwidth and queued format to the remote location.
1. A system for storage and bandwidth reduction of visual events in association with an activity comprising:
at least one visual event detector having at least one camera that acquires images of the activity, at least one processor receiving the images and a visual event detection process operating at least in part on the processor that processes the images;
a queuing process that identifies the visual events from the at least one visual event detector and stores the visual events; and
a communications process that enables a reduced-bandwidth communications channel adapted to transmit the stored visual events in a reduced-bandwidth and queued format to a remote location,
wherein the visual events comprise activities related to at least one of bridge operations and safety operations with respect to a vessel, and wherein the visual events are based upon software-based event detectors and hardware-based event detectors that identify predetermined characteristics within one or more of the images and determine a state based upon the characteristics.
18. A user interface operated by a base location processor located at a base location and receiving visual events over a communication channel from a remote, moving or stationary object, in which the object includes at least one visual event detector having at least one camera that acquires images of an activity, at least one object processor receiving the images and a visual event detection process operating at least in part on the at least one object processor that processes the images and a communications process that enables the communication channel adapted to transmit stored visual events in queued format to the base location, the user interface at the base location comprising:
a dashboard display for displaying information associated with the object,
wherein the visual events comprise activities related to at least one of bridge operations and safety operations, with respect to a vessel or fleet of vessels, and wherein the visual events are based upon software-based event detectors and hardware-based event detectors that identify predetermined characteristics within one or more of the images and determine a state based upon the characteristics.
2. The system as set forth in
3. The system as set forth in
4. The system as set forth in
5. The system as set forth in
6. The system as set forth in
7. The system as set forth in
8. The system as set forth in
9. The system as set forth in
10. The system as set forth in
11. The system as set forth in
12. The system as set forth in
13. The system as set forth in
14. The system as set forth in
15. The system as set forth in
16. The system as set forth in
17. The system as set forth in
19. The user interface as set forth in
20. The user interface as set forth in
21. The user interface as set forth in
22. The user interface as set forth in
23. The user interface as set forth in
24. The user interface as set forth in
25. The user interface as set forth in
This application is a continuation of International Application Serial No. PCT/US22/15564, entitled SYSTEM AND METHOD FOR BANDWIDTH REDUCTION AND COMMUNICATION OF VISUAL EVENTS, filed Feb. 8, 2022, which is a continuation of, and claims priority to, U.S. patent application Ser. No. 17/175,364, SYSTEM AND METHOD FOR BANDWIDTH REDUCTION AND COMMUNICATION OF VISUAL EVENTS, filed Feb. 12, 2021, now U.S. Pat. No. 11,132,552, issued Sep. 28, 2021, the teachings of each of which applications are expressly incorporated herein by reference.
This application relates to systems and methods for communicating visual data and related events over a reduced bandwidth communication link, and user interfaces for viewing and manipulating such data and events.
International shipping is a critical part of the world economy. Ocean-going merchant freight vessels are employed to carry virtually all goods and materials between ports and nations. The current approach to goods shipments employs intermodal cargo containers, which are loaded and unloaded from the deck of ships, and are carried in a stacked configuration. Freight is also shipped in bulk carriers (e.g. grain) or liquid tankers (e.g. oil). The operation of merchant vessels can be hazardous, and safety concerns are always present. Likewise, passenger vessels, with their precious human cargo, are equally, if not more, concerned with safety of operations and adherence to rules and regulations by crew and passengers. Knowledge of the current status of the vessel, crew and cargo can be highly useful in ensuring safe and efficient operation.
In many areas of commercial and/or government activity, visual monitoring (manual and automated surveillance) and other status sensors are employed to ensure safe and rule-conforming operation. These approaches, however, entail the generation and transmission of large volumes of data to a local or remote location, where such data is stored and/or analyzed by management personnel. However, unlike most land-based (i.e. wired, fiber or high-bandwidth wireless) communication links, it is often much more challenging to transmit useful data (e.g. visual information) from ship-to-shore. It can be assumed that ten (10) channels of raw video data are the minimum number needed to provide shipboard visibility, and that 5 Mb/s of uplink speed per HD video channel, or 50 Mb/s in aggregate, is required. Conversely, a typical satellite link to/from a ship is 1/200 or 1/400 of that size, transmitting only 256 Kb/s or 128 Kb/s.
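By way of a non-limiting illustration, the bandwidth mismatch described above can be checked with simple arithmetic (all figures are the assumed values from the text, not measurements):

```python
# Illustrative arithmetic only: 10 HD channels at an assumed 5 Mb/s each,
# compared against typical 256 Kb/s and 128 Kb/s satellite uplinks.
channels = 10
mbps_per_channel = 5
aggregate_kbps = channels * mbps_per_channel * 1000  # 50 Mb/s = 50,000 Kb/s

for satellite_kbps in (256, 128):
    ratio = aggregate_kbps / satellite_kbps
    # ~1/195 and ~1/391, i.e. roughly the 1/200 and 1/400 figures above
    print(f"{satellite_kbps} Kb/s link carries ~1/{round(ratio)} of the raw feed")
```

The rounding confirms that the "1/200 or 1/400" characterization in the text is an order-of-magnitude statement rather than an exact ratio.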
It is desirable to provide a system and method that enables continuous visibility into the shipboard activities, shipboard behavior, and shipboard status of an at-sea commercial merchant vessel (cargo, fishing, industrial, and passenger). It is further desirable that the transmitted visual data and associated status be accessible via an interface that aids users in manipulating, organizing and acting upon such information.
This invention overcomes the disadvantages of the prior art by providing a system and method that employs reduction in bandwidth and the amount of data stored, along with queuing of data, so that the significant and/or relevant shipboard visual information is detected and communicated continuously from the ship (or other remote vehicle/location) to shore without (free of) the loss of useful/high-priority information and within the available bandwidth of the typical, available satellite link. The system and method supports remote configuration and management from shore to the ship over the same communications channel but in the reverse direction. The system and method further facilitates the logical summarization of continuous visibility into shipboard activities, shipboard behavior, and shipboard status so that a land/shore-based manager/reviewer/auditor can review, comprehend and synthesize such information, at a glance, using a land-based computer graphical user interface (GUI), which can be defined by a dashboard, related to events that have recently transpired on the sailing commercial merchant vessel without the need to review hours/days/weeks of (e.g.) 10+ channels of video. Notably, the system and method effectively provides that visibility to the land-based computer dashboard.
In an illustrative embodiment, a system and method for storage and bandwidth reduction of visual events in association with an activity is provided. At least one visual event detector is provided, and can include at least one camera that acquires images of the activity. At least one processor receives the images and a visual event detection process operates at least in part on the processor that processes the images. A queuing process identifies visual events from the at least one visual event detector and stores the visual events. A communications process enables a reduced-bandwidth communications channel adapted to transmit the stored visual events in a reduced-bandwidth and queued format to a remote location. Illustratively, the visual events can comprise activities related to at least one of bridge operations, cargo operations, maintenance operations, safety operations, and vetting and security with respect to a vessel. The visual events can be based upon software-based and hardware-based event detectors that identify predetermined characteristics within one or more of the images and determine a state based upon the characteristics. The visual events can provide data primitives that are derived into user information, and these data primitives can, thus, include at least one of statistics, pattern matches, metrics and time. The stored visual events can be communicated across the communications channel based upon user-set priorities, wherein higher priority events are communicated first in a queue from the stored visual events. A user interface can display information of the at least one of the bridge operations, cargo operations, maintenance operations, safety operations, and vetting and security at a base location in one or more predetermined presentation formats. The user interface can include inputs that enable the user-set priorities. The user-interface can include inputs that enable adding or setting configuration parameters for the event detectors. 
The user information can also include at least one of fleet reports on a plurality of vessels associated with the user interface, vessel reports, occurrence of discrete events and occurrence of complex events comprising a plurality of discrete events. The information can be displayed in a dashboard format comprising one or more visual windows, alerts and numerical data.
In an illustrative embodiment, a user interface is provided, which is located at a base location and receives visual events over a communication channel from a remote, moving or stationary object, in which the object includes at least one visual event detector having at least one camera that acquires images of the activity. At least one processor is provided, which receives the images, and a visual event detection process operates, at least in part, on the processor to process the images. A communications process enables a communications channel adapted to transmit the stored visual events in a queued format to the base location. The user interface includes a dashboard for displaying information associated with the object. The user interface enables configurations of at least one of (a) priority of visual events to be communicated over the communications channel and (b) configuration of characteristics associated with the visual events. The communications channel can be a reduced-bandwidth, wireless communications channel. The object can be a sea-going vessel and the visual events can relate to at least one of safety, maintenance, cargo handling, bridge operations, vetting and security relative to the vessel. A dashboard display on the user interface can provide at least one of configuration, status and performance information relative to the vessel. The dashboard display on the user interface can provide at least one of configuration, status and performance information relative to each vessel of a fleet of vessels. Illustratively, the information includes one or more forms of displayed sensor data, having at least one of a label, highlight, comment and annotation visually appended thereto. The information handled and/or displayed by the interface can include a workflow provided between one or more users or vessels. 
The communication channel can be arranged to transmit control data from the user interface to one or more of the vessels and receive feedback via the user interface relative to actions on the vessel with respect to the control data.
In an illustrative embodiment, a method for storage and bandwidth reduction of visual events in association with an activity is provided. The method acquires images of the activity with at least one camera and performs visual event detection with a processor. Visual events from the at least one visual event detector are identified and stored in a queue based upon predetermined parameters. Communication with a remote location occurs over a reduced-bandwidth communications channel adapted to transmit the stored visual events in a reduced-bandwidth and queued format to the remote location.
The invention description below refers to the accompanying drawings, of which:
Note that data used herein can include both direct feeds from appropriate sensors and also data feeds from other data sources that can aggregate various information, telemetry, etc. For example, location and/or directional information can be obtained from navigation systems (GPS etc.) or other systems (e.g. via APIs) through associated data processing devices (e.g. computers) that are networked with a server 130 for the system. Similarly, crew members can input information via an appropriate user interface. The interface can request specific inputs—for example logging into or out of a shift, providing health information, etc.—or the interface can search for information that is otherwise input by crew during their normal operations—for example, determining when a crew member is entering data in the normal course of shipboard operations to ensure proper procedures are being attended to in a timely manner.
The shipboard location 110 can further include a local image/other data recorder 120. The recorder can be a standalone unit, or part of a broader computer server arrangement 130 with appropriate processor(s), data storage and network interfaces. The server 130 can perform generalized shipboard operations, or be dedicated to operations of the system and method herein, with appropriate software. The server 130 communicates with a workstation or other computing device 132 that can include an appropriate display (e.g. a touchscreen) 134 and other components that provide a graphical user interface (GUI). The GUI provides a user on board the vessel with a local dashboard for viewing and controlling manipulation of event data generated by the sensors 118, as described further below. Note that display and manipulation of data can include, but is not limited to, enrichment of the displayed data (e.g. images, video, etc.) with labels, comments, flags, highlights, and the like.
The information handled and/or displayed by the interface can include a workflow provided between one or more users or vessels. Such a workflow would be a business process where information is transferred from user to user (at shore or at sea interacting with the application over the GUI) for action according to the business procedures/rules/policies. This workflow automation is commonly referred to as “robotic process automation.”
The processes 150 that run the dashboard and other data-handling operations in the system and method can be performed in whole or in part with the on-board server 130, and/or using a remote computing (server) platform 140 that is part of a land-based, or other generally fixed, location with sufficient computing/bandwidth resources (a base location 142). The processes 150 can generally include a computation process 152 that converts sensor data into meaningful events. This can include machine vision algorithms and similar procedures. A data-handling process 154 can be used to derive events and associated status based upon the events—for example movements of the crew and equipment, cargo handling, etc. An information process 156 can be used to drive dashboards for one or more vessels and provide both status and manipulation of data for a user on the ship and at the base location.
Data is communicated between the ship (or other remote location) 110 and the base 142 over one or more reduced-bandwidth wireless channels, which can be facilitated by a satellite uplink/downlink 160, or another transmission modality—for example, long-wavelength, over-air transmission. Moreover, other forms of wireless communication can be employed, such as mesh networks and/or underwater communication (for example long-range, sound-based communication and/or VLF). Note that when the ship is located near a land-based high-bandwidth channel or physically connected by-wire while at port, the system and method herein can be adapted to utilize that high-bandwidth channel to send all previously unsent low-priority events, alerts, and/or image-based information.
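A minimal sketch of the opportunistic use of a high-bandwidth port/shore link, as described above, is shown below; the function name, bandwidth figures and event strings are hypothetical placeholders, not part of any embodiment:

```python
# Hypothetical sketch: when a high-bandwidth (port/wired) link is available,
# flush all previously unsent low-priority events at once; otherwise leave
# them queued for priority-based transmission over the satellite link.
SATELLITE_KBPS = 256        # assumed typical at-sea uplink
PORT_LINK_KBPS = 100_000    # assumed wired/near-shore link

def flush_backlog(queue, link_kbps):
    """Drain the whole backlog only when a high-bandwidth link is present."""
    if link_kbps <= SATELLITE_KBPS:
        return []  # stay with the normal priority-based trickle
    sent = list(queue)
    queue.clear()  # backlog has been handed to the transmitter
    return sent

backlog = ["low-priority maintenance clip", "low-priority status image"]
flushed = flush_backlog(backlog, PORT_LINK_KBPS)  # drains both events
```

The design choice here mirrors the text: low-priority material is never discarded while queued, only deferred until a cheap, wide channel appears.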
The (shore) base server environment 140 communicates via an appropriate, secure and/or encrypted link (e.g. a LAN or WAN (Internet)) 162 with a user workstation 170 that can comprise a computing device with an appropriate GUI arrangement, which defines a user dashboard 172 allowing for monitoring and manipulation of one or more vessels in a fleet over which the user is responsible and manages.
Referring further to
Referring again to
Note that, in various embodiments, the bandwidth of the communications link between vessel and base can be limited by external systems, such as QoS (quality of service) settings on routers/links, or by the internal system (edge server 130), for example to limit usage to (e.g.) 15% of total available communication bandwidth. This limitation in bandwidth can be based on a variety of factors, including, but not limited to, the time of day and/or a communications satellite usage cost schedule. An appropriate instruction set can be programmed into the server using conventional or custom control processes. The specific settings for such bandwidth control can also be directed by the user via the GUI.
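One plausible way for the edge server to cap usage at a fraction of link bandwidth is a token-bucket limiter; the following is an illustrative sketch (the class name, rates and 15% default are assumptions drawn from the example above, not a prescribed implementation):

```python
import time

class BandwidthLimiter:
    """Token-bucket style limiter capping transmission to a fraction of the
    total link bandwidth (e.g. 15% of a 256 Kb/s satellite link)."""

    def __init__(self, link_bps, fraction=0.15):
        self.rate_bps = link_bps * fraction   # allowed bits per second
        self.tokens = self.rate_bps           # start with one second's budget
        self.last = time.monotonic()

    def try_send(self, size_bits):
        # Refill the budget in proportion to elapsed time, up to one second's
        # worth; payloads larger than the budget must be deferred (or chunked).
        now = time.monotonic()
        self.tokens = min(self.rate_bps,
                          self.tokens + (now - self.last) * self.rate_bps)
        self.last = now
        if size_bits <= self.tokens:
            self.tokens -= size_bits
            return True    # transmit now
        return False       # defer until the budget refills

limiter = BandwidthLimiter(link_bps=256_000)  # 256 Kb/s link, 38.4 Kb/s budget
```

A schedule-aware variant could simply swap `fraction` per time of day or per satellite cost tier, matching the cost-schedule factor mentioned in the text.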
As shown in
As shown in
Note that the above-recited listing of examples (a)-(j) includes only some of a wide range of possible interactions that can form the basis of detectors according to illustrative embodiments herein. Those of skill should understand that other detectable events involving person-to-person, person-to-equipment or equipment-to-equipment interaction are expressly contemplated.
In operation, an expected event visual detector takes as input the detection result of one or more vision systems aboard the vessel. The result could be a detection, no detection, or an anomaly at the time of the expected event according to the plan. Multiple events or multiple detections can be combined into a single higher-level event. For example, maintenance procedures, cargo activities, or inspection rounds may result from combining multiple events or multiple detections. Note that each visual event is associated with a particular (or several) vision system camera(s) 118, 180, 182 at a particular time and the particular image or video sequence at a known location within the vessel. The associated video can be optionally sent or not sent with each event or alarm. When the video is sent with the event or alarm, it may be useful for later validation of the event or alarm. Notably, the discrete images and/or short-time video frame sequences actually represent a small fraction of the video stream, and consequently represent a substantial reduction in the bandwidth required for transmission in comparison to the entire video sequence over the reduced-bandwidth link. Moreover, in addition to compacting the video by reducing it to a few images or a short-time sequence, the system can reduce the images in size, either by cropping the images down to the significant or meaningful image locations required by the detector, or by reducing the resolution, say, from the equivalent of high-definition (HD) resolution to standard-definition (SD) resolution, or below standard resolution.
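The cropping and resolution-reduction steps described above can be sketched as follows; this is a simplified illustration using nested lists in place of real camera frames, and the function names are hypothetical:

```python
def crop(image, top, left, height, width):
    """Crop a 2D image (nested lists of pixel values) to a region of
    interest, e.g. the image locations required by a detector."""
    return [row[left:left + width] for row in image[top:top + height]]

def downsample(image, factor):
    """Reduce resolution by keeping every `factor`-th pixel in each
    dimension (a crude stand-in for proper HD-to-SD rescaling)."""
    return [row[::factor] for row in image[::factor]]

# A toy 4x4 "frame"; real frames would come from the shipboard cameras.
frame = [[r * 10 + c for c in range(4)] for r in range(4)]
roi = crop(frame, 1, 1, 2, 2)   # 2x2 region around a detected event
small = downsample(frame, 2)    # one quarter of the original pixel count
```

Either step alone shrinks the payload; applied together (crop first, then downsample), the transmitted pixels can be a small fraction of the raw frame.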
In addition to reducing bandwidth by identifying events via the vision system and cropping such images where appropriate, the number of image frames in a sequence can be reduced by increasing the interval of time between frames. Moreover, bandwidth can be even further reduced by applying the procedures above, and then subjecting (all on the shipboard server side) the event-centric, cropped, spaced-apart image data to commercially available or customized lossy or lossless image compression techniques. Such techniques can include, but are not limited to, discrete cosine transform (DCT), run-length encoding (RLE), predictive coding, and/or Lempel-Ziv-Welch (LZW).
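Of the named techniques, run-length encoding (RLE) is simple enough to sketch directly; the following illustrative implementation round-trips one row of pixel values (it is a generic RLE, not the specific codec of any embodiment):

```python
def rle_encode(data):
    """Run-length encode a sequence into (value, run_length) pairs."""
    if not data:
        return []
    runs, current, count = [], data[0], 1
    for value in data[1:]:
        if value == current:
            count += 1
        else:
            runs.append((current, count))
            current, count = value, 1
    runs.append((current, count))
    return runs

def rle_decode(runs):
    """Invert rle_encode, restoring the original sequence losslessly."""
    return [value for value, count in runs for _ in range(count)]

row = [0, 0, 0, 255, 255, 0, 0, 0, 0]  # e.g. one row of a mostly-dark frame
packed = rle_encode(row)               # [(0, 3), (255, 2), (0, 4)]
assert rle_decode(packed) == row       # lossless round trip
```

RLE pays off exactly on the kind of data the text describes: cropped, low-variation frames with long runs of identical pixel values.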
The images or video sequences NOT associated with visual events may be stored for some period of time on board the vessel.
The shipboard server establishes a priority of transmission for the processed visual events that is based upon settings provided from a user, typically operating the on-shore (base) dashboard. The shipboard server buffers these events in a queue in storage that can be ordered based upon the priority. Priority can be set based on a variety of factors—for example personnel safety and/or ship safety can have first priority and maintenance can have last priority, generally mapping to the urgency of such matters. By way of example, all events in the queue with highest priority are sent first. They are followed by events with lower priority. If a new event arrives shipboard with higher priority, then that new higher priority event will be sent ahead of lower priority events. It is contemplated that the lowest priority events can be dropped if higher priority events take all available bandwidth. The shipboard server receives acknowledgements from the base server on shore and confirms that events have been received and acknowledged on shore before marking the shipboard events as having been sent. Multiple events may be transmitted prior to receipt (or lack of receipt) of acknowledgement. Lack of acknowledgement potentially stalls the queue or requires retransmission of an event prior to transmitting all next events in the priority queue on the server. The shore-based server interface can configure or select the visual event detectors over the communications link. In addition to visual events, the system can transmit non-visual events like a fire alarm signal or smoke alarm signal.
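The priority-queue-with-acknowledgement scheme described above can be sketched as follows; class and method names are hypothetical, and lower numbers denote higher priority:

```python
import heapq
import itertools

class EventQueue:
    """Sketch of the transmission scheme described above: highest-priority
    events go first; events remain pending until acknowledged from shore."""

    def __init__(self):
        self._heap = []
        self._seq = itertools.count()  # FIFO tiebreak within a priority level
        self.pending_ack = {}          # sent but not yet acknowledged

    def add(self, priority, event):
        # Lower number = higher priority (e.g. 0 = safety, 9 = maintenance).
        heapq.heappush(self._heap, (priority, next(self._seq), event))

    def send_next(self):
        """Pop the highest-priority event and hold it awaiting acknowledgement."""
        priority, seq, event = heapq.heappop(self._heap)
        self.pending_ack[seq] = (priority, event)
        return seq, event

    def acknowledge(self, seq):
        """Shore confirmed receipt; only now is the event marked as sent."""
        del self.pending_ack[seq]

q = EventQueue()
q.add(5, "maintenance overdue")
q.add(0, "person overboard")   # arrives later but jumps the queue
seq, event = q.send_next()     # highest priority first
q.acknowledge(seq)
```

A later-arriving high-priority event preempts queued low-priority ones, and an unacknowledged event can be retransmitted from `pending_ack`, matching the behavior described in the text.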
As shown in
Other exemplary detection flows can be provided as appropriate to generate desired information on activities of interest by the ship's personnel and systems. Such detection flows employ relevant detector types, parameters, etc.
A. Status and Performance
B. Event Tracking
C. Reports
It should be clear that the above-described system and method effectively provides a user with meaningful information on the status and performance of various activities that are critical to safe and efficient operation of a vessel. This information includes relevant visual information that can be highly beneficial in understanding activities and providing evidence for follow-on analysis, etc. The system and method can operate to deliver meaningful visual and other information in near real time over small bandwidth communication channels. A graphical user interface can be provided to control the system and method, setting priorities, tracking and reporting activities as appropriate. The system and method can also be adapted to operate over larger bandwidth channels and/or with other types of remote locations—for example islands, subterranean/underground, arctic and Antarctic stations, space-based locations, etc.
The foregoing has been a detailed description of illustrative embodiments of the invention. Various modifications and additions can be made without departing from the spirit and scope of this invention. Features of each of the various embodiments described above may be combined with features of other described embodiments as appropriate in order to provide a multiplicity of feature combinations in associated new embodiments. Furthermore, while the foregoing describes a number of separate embodiments of the apparatus and method of the present invention, what has been described herein is merely illustrative of the application of the principles of the present invention. For example, as used herein, the terms “process” and/or “processor” should be taken broadly to include a variety of electronic hardware and/or software-based functions and components (and can alternatively be termed functional “modules” or “elements”). Moreover, a depicted process or processor can be combined with other processes and/or processors or divided into various sub-processes or processors. Such sub-processes and/or sub-processors can be variously combined according to embodiments herein. Likewise, it is expressly contemplated that any function, process and/or processor herein can be implemented using electronic hardware, software consisting of a non-transitory computer-readable medium of program instructions, or a combination of hardware and software. Additionally, as used herein various directional and dispositional terms such as “vertical”, “horizontal”, “up”, “down”, “bottom”, “top”, “side”, “front”, “rear”, “left”, “right”, and the like, are used only as relative conventions and not as absolute directions/dispositions with respect to a fixed coordinate space, such as the acting direction of gravity. 
Additionally, where the term “substantially” or “approximately” is employed with respect to a given measurement, value or characteristic, it refers to a quantity that is within a normal operating range to achieve desired results, but that includes some variability due to inherent inaccuracy and error within the allowed tolerances of the system (e.g. 1-5 percent). Accordingly, this description is meant to be taken only by way of example, and not to otherwise limit the scope of this invention.
Naslavsky, Ilan, Michael, David J., Perry, Osher, Cohen, Moran
Patent | Priority | Assignee | Title |
10936907, | Aug 10 2018 | Buffalo Automation Group Inc. | Training a deep learning system for maritime applications |
11132552, | Feb 12 2021 | Shipin Systems Inc.; SHIPIN SYSTEMS INC | System and method for bandwidth reduction and communication of visual events |
9996749, | May 29 2015 | Accenture Global Solutions Limited | Detecting contextual trends in digital video content |
20020075546, | |||
20030025599, | |||
20040008253, | |||
20050055330, | |||
20070260363, | |||
20090102950, | |||
20110257819, | |||
20140059468, | |||
20170140603, | |||
20180239948, | |||
20180239982, | |||
20200012283, | |||
20200064466, | |||
20200264268, | |||
20200327345, | |||
20210174952, | |||
20220144392, | |||
20220253763, | |||
20220261483, | |||
20220396340, | |||
CN109819393, | |||
CN110363463, | |||
CN210464459, | |||
GB2609530, | |||
GB2609560, | |||
KR102320142, | |||
KR20130137876, | |||
KR20210019862, | |||
WO2022269609, |
Executed on | Assignor | Assignee | Conveyance | Frame | Reel | Doc |
Aug 11 2023 | Shipin Systems Inc. | (assignment on the face of the patent) | / |
Date | Maintenance Fee Events |
Aug 11 2023 | BIG: Entity status set to Undiscounted (note the period is included in the code). |
Aug 30 2023 | SMAL: Entity status set to Small. |
Date | Maintenance Schedule |
Feb 20 2027 | 4 years fee payment window open |
Aug 20 2027 | 6 months grace period start (w surcharge) |
Feb 20 2028 | patent expiry (for year 4) |
Feb 20 2030 | 2 years to revive unintentionally abandoned end. (for year 4) |
Feb 20 2031 | 8 years fee payment window open |
Aug 20 2031 | 6 months grace period start (w surcharge) |
Feb 20 2032 | patent expiry (for year 8) |
Feb 20 2034 | 2 years to revive unintentionally abandoned end. (for year 8) |
Feb 20 2035 | 12 years fee payment window open |
Aug 20 2035 | 6 months grace period start (w surcharge) |
Feb 20 2036 | patent expiry (for year 12) |
Feb 20 2038 | 2 years to revive unintentionally abandoned end. (for year 12) |