A system includes one or more components within an internal cabin of a vehicle. An imaging device is configured to obtain an image of the one or more components. A state determination control unit includes a processor. The state determination control unit is in communication with the imaging device. The state determination control unit receives image data including the image from the imaging device. Further, the state determination control unit determines a state of the one or more components based on the image data.
1. A system comprising:
components within an internal cabin of a vehicle, wherein the components are in a row;
an imaging device configured to obtain an image of the components in the row; and
a state determination control unit comprising a processor, wherein the state determination control unit is in communication with the imaging device,
wherein the state determination control unit is configured to receive image data including the image of the components from the imaging device, and
wherein the state determination control unit is configured to determine a state of the components based on the image data including the image of the components in the row, wherein the state determination control unit is further configured to:
generate, within the image, a first end line at a first end of the row and a second end line at a second end of the row opposite from the first end line, and
generate, within the image, an upper boundary line associated with an upper edge of the row, and a lower boundary line associated with a lower edge of the row, wherein the first end line and the upper boundary line intersect at a first corner, wherein the first end line and the lower boundary line intersect at a second corner, wherein the second end line and the upper boundary line intersect at a third corner, wherein the second end line and the lower boundary line intersect at a fourth corner, and wherein the first corner, the second corner, the third corner, and the fourth corner are corners of a quadrilateral that bounds the components.
2. The system of
4. The system of
5. The system of
6. The system of
7. The system of
8. The system of
9. The system of
10. The system of
11. The system of
12. The system of
13. The system of
14. A method comprising:
obtaining, by an imaging device, an image of components in a row within a vehicle;
receiving, by a state determination control unit comprising a processor that is in communication with the imaging device, image data including the image of the components in the row from the imaging device; and
determining, by the state determination control unit, a state of the components based on the image data including the image of the components in the row, wherein said determining comprises generating, within the image, a first end line at a first end of the row and a second end line at a second end of the row opposite from the first end line, and generating, within the image, an upper boundary line associated with an upper edge of the row, and a lower boundary line associated with a lower edge of the row, wherein the first end line and the upper boundary line intersect at a first corner, wherein the first end line and the lower boundary line intersect at a second corner, wherein the second end line and the upper boundary line intersect at a third corner, wherein the second end line and the lower boundary line intersect at a fourth corner, and wherein the first corner, the second corner, the third corner, and the fourth corner are corners of a quadrilateral that bounds the components, and
wherein the state comprises one or both of an open state or a closed state, and wherein the components are overhead stowage bin assemblies.
15. The method of claim 14, wherein said determining comprises:
determining a region of interest within the image data;
determining the region of interest by determining a vanishing point within the image; and
determining a bisecting line that passes through the vanishing point, and determining the region of interest by the bisecting line.
16. The method of claim 14, further comprising correcting for perspective, by the state determination control unit, within the image data.
17. The method of claim 14, wherein said determining further comprises:
geometrically transforming the quadrilateral into a rectangle having respective corresponding corners to form a perspective corrected image.
18. The method of claim 14, wherein said determining further comprises:
extrapolating a shape of each of the components within the image data;
determining the state of the components through clustering; and
determining a difference in image attributes.
19. The method of claim 14, further comprising:
outputting, by the state determination control unit, a state signal indicative of the state of the components to a user device; and
outputting, by the state determination control unit, a component control signal that automatically operates the components based on the state of the components.
20. A non-transitory computer-readable storage medium comprising executable instructions that, in response to execution, cause a system comprising a processor to perform operations comprising:
receiving image data comprising an image of components in a row within an internal cabin of a vehicle from an imaging device within the internal cabin;
determining a state of the components based on the image data including the image of the components in the row, wherein said determining comprises generating, within the image, a first end line at a first end of the row and a second end line at a second end of the row opposite from the first end line, and generating, within the image, an upper boundary line associated with an upper edge of the row, and a lower boundary line associated with a lower edge of the row, wherein the first end line and the upper boundary line intersect at a first corner, wherein the first end line and the lower boundary line intersect at a second corner, wherein the second end line and the upper boundary line intersect at a third corner, wherein the second end line and the lower boundary line intersect at a fourth corner, and wherein the first corner, the second corner, the third corner, and the fourth corner are corners of a quadrilateral that bounds the components; and
outputting a component control signal that automatically operates the components based on the state of the components.
21. The non-transitory computer-readable storage medium of
This application is a non-provisional application of U.S. Patent Application No. 63/057,943, filed 29 Jul. 2020, the entire disclosure of which is incorporated herein by reference.
Embodiments of the subject disclosure generally relate to systems and methods for determining a state of a component, such as a stowage bin assembly, within an internal cabin of a vehicle.
Aircraft are used to transport passengers and cargo between various locations. An internal cabin of an aircraft includes numerous components that may be moved between open and closed states. For example, overhead stowage bin assemblies in an internal cabin include bins moveable between open and closed states. As another example, an exit door is moveable between an open and closed state. As another example, a door of a lavatory within an internal cabin is moveable between an open and closed state. As another example, a galley cart compartment may be in an open state, in which a galley cart can be moved into the compartment, and a closed state, in which a galley cart is already within the compartment. In general, an internal cabin of a commercial aircraft includes various components that are moveable between open and closed states.
During different stages of a flight, certain components are expected to be in a particular state. For example, during a boarding process, stowage bin assemblies are in an open state to allow passengers to stow luggage and other personal items. As a departure time approaches, a flight attendant typically monitors the stowage bin assemblies and ensures that each is closed before the aircraft moves onto a taxiway and/or a runway. That is, an individual such as a flight attendant views and determines if any stowage bin assemblies are open. As can be appreciated, the process of walking through an internal cabin and visually determining which stowage bin assemblies are open and closed takes time. Further, in the event a flight attendant becomes distracted, such as if a passenger needs assistance, an open stowage bin assembly may be overlooked.
Some stowage bin assemblies include or are otherwise coupled to sensors that are connected to a monitoring device through wiring. However, the sensors and wiring add weight to an aircraft. The sensors and wiring also increase manufacturing complexity, as the sensors and wiring need to be positioned and/or routed. Further, the additional weight from the sensors and wiring reduces fuel efficiency.
A need exists for a system and a method for automatically determining a state of a component within an internal cabin of a vehicle. Further, a need exists for a system and a method for automatically determining whether stowage bin assemblies within an internal cabin of a vehicle are open or closed. Additionally, a need exists for a system and a method of monitoring a state of a component within an internal cabin that does not substantially add weight to a vehicle.
With those needs in mind, certain embodiments of the subject disclosure provide a system including one or more components within an internal cabin of a vehicle, an imaging device configured to obtain an image of the one or more components, and a state determination control unit comprising a processor. The state determination control unit is in communication with the imaging device. The state determination control unit receives image data including the image from the imaging device, and determines a state of the one or more components based on the image data.
In at least one embodiment, the state includes one or both of an open state or a closed state. In at least one embodiment, the one or more components are one or more overhead stowage bin assemblies.
In at least one embodiment, the state determination control unit determines a region of interest within the image data. For example, the state determination control unit determines the region of interest by determining a vanishing point within the image. As a further example, the state determination control unit determines a bisecting line that passes through the vanishing point. The region of interest is determined by the bisecting line.
In at least one embodiment, the state determination control unit corrects for perspective within the image data. For example, the one or more components include a plurality of components in a row. The state determination control unit generates a first end line at a first end of the row and a second end line at a second end of the row opposite from the first end line. The state determination control unit further generates an upper boundary line associated with an upper edge of the row, and a lower boundary line associated with a lower edge of the row. The first end line and the upper boundary line intersect at a first corner. The first end line and the lower boundary line intersect at a second corner. The second end line and the upper boundary line intersect at a third corner. The second end line and the lower boundary line intersect at a fourth corner. The first corner, the second corner, the third corner, and the fourth corner are corners of a quadrilateral that bounds the plurality of components. The state determination control unit geometrically transforms the quadrilateral into a rectangle having respective corresponding corners to form a perspective corrected image.
In at least one embodiment, the state determination control unit extrapolates a shape of each of the one or more components within the image data.
In at least one embodiment, the state determination control unit determines the state of the one or more components through clustering.
In at least one embodiment, the state determination control unit determines the state of the one or more components by determining a difference in image attributes.
In at least one example, the system further includes a user device. The state determination control unit outputs a state signal indicative of the state of the one or more components to the user device.
In at least one example, the state determination control unit outputs a component control signal that automatically operates the one or more components based on the state of the one or more components.
Certain embodiments of the subject disclosure provide a method including obtaining, by an imaging device, an image of one or more components within an internal cabin of a vehicle, receiving, by a state determination control unit comprising a processor that is in communication with the imaging device, image data including the image from the imaging device, and determining, by the state determination control unit, a state of the one or more components based on the image data.
Certain embodiments of the subject disclosure provide a non-transitory computer-readable storage medium comprising executable instructions that, in response to execution, cause a system comprising a processor to perform operations including receiving image data including an image of one or more components within an internal cabin of a vehicle from an imaging device within the internal cabin, and determining a state of the one or more components based on the image data.
The foregoing summary, as well as the following detailed description of certain embodiments will be better understood when read in conjunction with the appended drawings. As used herein, an element or step recited in the singular and preceded by the word “a” or “an” should be understood as not necessarily excluding the plural of the elements or steps. Further, references to “one embodiment” are not intended to be interpreted as excluding the existence of additional embodiments that also incorporate the recited features. Moreover, unless explicitly stated to the contrary, embodiments “including,” “comprising,” or “having” an element or a plurality of elements having a particular condition can include additional elements not having that condition.
Certain embodiments of the subject disclosure provide a system and a method for determining a state (such as an open state and a closed state) of a component (such as a stowage bin assembly) within an internal cabin of a vehicle (such as an aircraft). The system and method are configured to detect a state of a component using in-cabin video cameras in real-time to automatically log these events, trigger further procedures and alert flight crew with event information.
Certain embodiments of the subject disclosure provide a system and a method for automatically detecting cabin events using a video feed available from on-board cameras to automatically log the events, trigger subsequent procedures, and alert crew with event information. By processing the events over multiple flights, the system and method are able to detect potentially abnormal events.
In at least one embodiment, the system and method include viewing an in-cabin camera input, determining regions of interest, calculating a vanishing point, applying an algorithm to correct perspective, extrapolating bin location, clustering components to identify a state, and alerting crew of status. By analyzing image data obtained from one or more imaging devices to detect a state of a component, embodiments of the subject disclosure are configured to operate without the use of sensors and wiring, thereby reducing manufacturing cost and complexity, as well as increasing fuel efficiency.
In at least one embodiment, the vehicle 106 is an aircraft. As another example, the vehicle 106 is a land-based vehicle, such as a bus, passenger train car, or the like. As another example, the vehicle 106 is a watercraft or a spacecraft.
The system 100 includes an imaging device 108 within the internal cabin 104. The imaging device 108 has a field of view 110. The component(s) 102 are within the field of view 110. In at least one embodiment, the imaging device 108 is a video camera. As another example, the imaging device 108 is a night vision camera. As another example, the imaging device 108 is an infrared camera.
The imaging device 108 can be a fixed camera within the internal cabin 104. For example, the imaging device 108 can be secured to a ceiling, wall, monument, or the like within the internal cabin 104. As another example, the imaging device 108 can be a mobile camera, such as a handheld camera that is part of a smart phone or smart tablet.
The system 100 further includes a state determination control unit 112 in communication with the imaging device 108. For example, the state determination control unit 112 wirelessly communicates with the imaging device 108, such as through a Bluetooth, Wi-Fi, or other such connection. Alternatively, the state determination control unit 112 can be in communication with the imaging device 108 through a wired connection.
In at least one embodiment, the component(s) 102 can include an actuator 114, such as an electric motor, that is configured to automatically move the component(s) 102 between an open state and a closed state. Further, the state determination control unit 112 wirelessly communicates with the actuator 114, in order to control opening and closing of the component(s) 102. Optionally, the state determination control unit 112 communicates with the actuator 114 through a wired connection. Also, optionally, the state determination control unit 112 is not in communication with the actuator 114. Also, optionally, the component 102 may not include the actuator 114.
In at least one embodiment, a user device 116 is also within the internal cabin 104. The user device 116 includes a display 118 coupled to an interface 120. For example, the user device 116 is a computer workstation within the internal cabin 104. As another example, the user device 116 is a handheld device, such as a smart phone or smart tablet. The display 118 can be a monitor or screen. The interface 120 can include a keyboard, mouse, and/or the like. In at least one embodiment, the display 118 and the interface 120 are integrated together into a touchscreen interface. The state determination control unit 112 is in communication with the user device 116, such as through a wired or wireless connection.
As described herein, the system 100 includes one or more components 102 within the internal cabin 104 of the vehicle 106. The imaging device 108 is configured to obtain an image of the one or more components 102. The state determination control unit 112 is in communication with the imaging device 108. The state determination control unit 112 receives the image data 122 including the image from the imaging device 108. The state determination control unit 112 determines a state (such as an open state or a closed state) of the one or more components 102 based on the image data 122.
In operation, the imaging device 108 obtains an image of the component(s) 102. The image may be a video image or a still photograph image. The imaging device 108 obtains the image of the component(s) 102 as image data 122. The state determination control unit 112 receives the image data 122 that includes the image of the component(s) 102 from the imaging device 108.
The state determination control unit 112 analyzes the image data 122 to determine a state of the component(s) 102. For example, the state determination control unit 112 analyzes the image data 122 to determine if the component(s) 102 (such as an overhead stowage bin assembly) is in an open state (that is, opened) or a closed state (that is, closed).
In at least one embodiment, the state determination control unit 112 analyzes the image data 122 by determining a region of interest within the image data 122. For example, in at least one embodiment, the region of interest includes a row of components 102, such as a row of overhead stowage bin assemblies within the internal cabin 104.
After determining the region of interest, the state determination control unit 112 perspective corrects the image data 122 to provide a perspective corrected image of the image data 122. For example, the state determination control unit 112 corrects for the perspective of an image of three dimensional space. Alternatively, the state determination control unit 112 may not perspective correct the image data 122.
Next, the state determination control unit 112 extrapolates individual components 102 within the perspective corrected image. For example, during an initial calibration or set up, the state determination control unit 112 is programmed to recognize a size and shape of each individual component 102 within the image data 122 (such as within the perspective corrected image). In at least one embodiment, the state determination control unit 112 extrapolates a shape of each of the components 102 within the image data 122.
The state determination control unit 112 then determines a state of the components 102 through clustering, such as by determining a difference in image attributes (for example, differences in color, color intensity, brightness, and/or the like). For example, the components 102 within the perspective corrected image are clustered based on differences in color therebetween. If there is a single cluster in relation to the components 102 within the perspective corrected image, then the state determination control unit 112 determines that all of the components 102 are in the same state, such as an open state or a closed state. In at least one embodiment, the state determination control unit 112 is calibrated to determine an image attribute for a closed state. For example, an image of a component 102 having a first color is determined to be in a closed state. As such, if a single cluster of components 102 all share the same attribute, such as the same color or color intensity, that is correlated to a closed state, then the state determination control unit 112 determines that all of the components 102 are closed. If, however, the single cluster of components 102 differs from the predetermined image attribute of a closed state, then the state determination control unit 112 determines that the components 102 are in an open state.
Further, the image data 122 may show that there are different clusters of images. For example, a first cluster corresponding to a first set of one or more components 102 has a first image attribute (such as a first color or intensity), and a second cluster corresponding to a second set of one or more components 102 has a second image attribute that differs from the first image attribute. Accordingly, the state determination control unit 112 determines that one of the first set or the second set is in an open state, while the other is in a closed state. The state determination control unit 112 determines which set is in a closed state or an open state based on a predetermined image attribute. For example, the state determination control unit 112 can be calibrated to recognize a first image attribute as the closed state, and a second image attribute differing from the first image attribute as the open state, or vice versa.
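As a concrete illustration of this clustering approach, the following is a minimal sketch that groups per-component mean colors with k-means and labels each cluster against a calibrated closed-state color. It assumes a Python environment with OpenCV and NumPy; the mean-color feature, the two-cluster choice, and the distance tolerance are illustrative assumptions rather than values from this disclosure.

```python
import cv2
import numpy as np

def cluster_bin_states(features, closed_reference, tolerance=40.0):
    """Cluster per-component image attributes (here, mean colors) into up
    to two groups and label each component open/closed by comparing each
    cluster center to a calibrated closed-state color (an assumption).

    features: float32 array of shape (num_components, 3), one mean color
    per component; closed_reference: float32 array of shape (3,).
    """
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 20, 1.0)
    _, labels, centers = cv2.kmeans(features, 2, None, criteria, 10,
                                    cv2.KMEANS_RANDOM_CENTERS)
    labels = labels.ravel()
    # One effective cluster: every component shares the same state.
    if np.linalg.norm(centers[0] - centers[1]) < tolerance:
        mean_center = centers.mean(axis=0)
        state = ("closed" if np.linalg.norm(mean_center - closed_reference)
                 < tolerance else "open")
        return [state] * len(features)
    # Two clusters: the one nearer the calibrated reference is "closed".
    closed_idx = int(np.argmin([np.linalg.norm(c - closed_reference)
                                for c in centers]))
    return ["closed" if label == closed_idx else "open" for label in labels]
```

Notably, this sketch compares clusters against a single calibrated reference rather than against stored historic imagery, consistent with the memory-efficient operation described below.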
The state determination control unit 112 associates the first cluster and the second cluster with locations of the components 102 within the internal cabin 104. For example, the state determination control unit 112 is calibrated to locate and label each of the components 102 within the internal cabin 104, such as via an initial manual calibration or setup. Accordingly, based on image analysis, the state determination control unit 112 detects the state (such as open or closed) of the components 102 within the internal cabin 104.
In at least one embodiment, the state determination control unit 112 detects the state of the components 102 through clustering, as described herein. The state determination control unit 112 compares clusters within the image data 122 to differentiate between open and closed states. In such an embodiment, the state determination control unit 112 does not compare the image data 122 against historic image data regarding open and closed states. As such, the system 100 can operate using less memory (which would otherwise store the historic data), and less computing power, thereby providing for efficient operation.
The state determination control unit 112 then outputs a state signal 126 to the user device 116. The state signal 126 is indicative of the state of the component(s) 102. The state signal 126 includes state data for the components 102 within the internal cabin 104. For example, the state signal 126 indicates which of the components 102 are open and closed. The state signal 126 is received by the user device 116, which may provide video or graphic data on the display 118, an audio signal, or the like to alert an attendant as to which of the components 102 are open and closed. In this manner, the state determination control unit 112 automatically detects a state of the components 102, and alerts an attendant, who is then able to close or open the components 102 as desired.
In at least one embodiment, the state determination control unit 112 can automatically operate the components 102, such as through the actuators 114, based on the determined states of the components 102. For example, after the state determination control unit 112 analyzes the image data 122 to determine which of the components 102 are open and closed, the state determination control unit 112 outputs a component control signal 128 to the actuators 114 to selectively close open components 102 or open closed components 102, as desired. Thus, based on a determination of a state of a component 102, the state determination control unit 112 may automatically operate the component 102, such as via the actuator 114, to selectively move the component 102 between an open state and a closed state. Alternatively, the state determination control unit 112 is not configured to automatically operate the component(s) 102.
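As a minimal sketch of this control flow, the following assumes the per-component states produced by the clustering step and a hypothetical send_actuator_command hook that stands in for whatever interface the actuators 114 expose; both names are illustrative assumptions.

```python
def output_component_control_signal(component_states, send_actuator_command):
    """Request closure of each component determined to be open.

    component_states: list of "open"/"closed" labels indexed by component
    position; send_actuator_command: hypothetical callback into the
    actuator interface, e.g. send_actuator_command(index, "close").
    """
    for index, state in enumerate(component_states):
        if state == "open":
            send_actuator_command(index, "close")
```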
Alternatively, the state determination control unit 112 can include or otherwise be coupled to a memory that stores image data regarding components 102 in open and closed states (and optionally full, partially full, empty, latched, and unlatched states). In this embodiment, the state determination control unit 112 compares the image data 122 in relation to the stored image data to determine whether the component(s) 102 are in open or closed states.
As used herein, the term “control unit,” “central processing unit,” “unit,” “CPU,” “computer,” or the like can include any processor-based or microprocessor-based system including systems using microcontrollers, reduced instruction set computers (RISC), application specific integrated circuits (ASICs), logic circuits, and any other circuit or processor including hardware, software, or a combination thereof capable of executing the functions described herein. Such are exemplary only, and are thus not intended to limit in any way the definition and/or meaning of such terms. For example, the state determination control unit 112 can be or include one or more processors that are configured to control operation thereof, as described herein.
The state determination control unit 112 is configured to execute a set of instructions that are stored in one or more data storage units or elements (such as one or more memories), in order to process data. For example, the state determination control unit 112 can include or be coupled to one or more memories. The data storage units can also store data or other information as desired or needed. The data storage units can be in the form of an information source or a physical memory element within a processing machine. The one or more data storage units or elements can comprise volatile memory or nonvolatile memory, or can include both volatile and nonvolatile memory. As an example, the nonvolatile memory can comprise read only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable PROM (EEPROM), and/or flash memory, and volatile memory can include random access memory (RAM), which can act as external cache memory. The data stores of the disclosed systems and methods are intended to comprise, without being limited to, these and any other suitable types of memory.
The set of instructions can include various commands that instruct the state determination control unit 112 as a processing machine to perform specific operations such as the methods and processes of the various embodiments of the subject matter described herein. The set of instructions can be in the form of a software program. The software can be in various forms such as system software or application software. Further, the software can be in the form of a collection of separate programs, a program subset within a larger program or a portion of a program. The software can also include modular programming in the form of object-oriented programming. The processing of input data by the processing machine can be in response to user commands, or in response to results of previous processing, or in response to a request made by another processing machine.
The diagrams of embodiments herein can illustrate one or more control or processing units, such as the state determination control unit 112. It is to be understood that the processing or control units can represent circuits, circuitry, or portions thereof that can be implemented as hardware with associated instructions (e.g., software stored on a tangible and non-transitory computer readable storage medium, such as a computer hard drive, ROM, RAM, or the like) that perform the operations described herein. The hardware can include state machine circuitry hardwired to perform the functions described herein. Optionally, the hardware can include electronic circuits that include and/or are connected to one or more logic-based devices, such as microprocessors, processors, controllers, or the like. Optionally, the state determination control unit 112 can represent processing circuitry such as one or more of a field programmable gate array (FPGA), application specific integrated circuit (ASIC), microprocessor(s), and/or the like. The circuits in various embodiments can be configured to execute one or more algorithms to perform functions described herein. The one or more algorithms can include aspects of embodiments disclosed herein, whether or not expressly identified in a flowchart or a method.
As used herein, the terms “software” and “firmware” are interchangeable, and include any computer program stored in a data storage unit (for example, one or more memories) for execution by a computer, including RAM memory, ROM memory, EPROM memory, EEPROM memory, and non-volatile RAM (NVRAM) memory. The above data storage unit types are exemplary only, and are thus not limiting as to the types of memory usable for storage of a computer program.
Referring to
The state determination control unit 112 then determines a bisecting line 806 that passes through the vanishing point 804. The bisecting line 806 is parallel to side edges 808 of the image 800, and perpendicular to a bottom edge 810 and a top edge 812 of the image 800. The bisecting line 806 bisects the image 800 into a first half 814 and a second half 816, each of which is associated with a separate region of interest. Thus, the regions of interest are determined by the bisecting line 806.
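For illustration, the following is a minimal sketch of this vanishing point and bisecting line determination, assuming a Python environment with OpenCV and NumPy; the Hough-segment intersection heuristic and its parameter values are illustrative assumptions rather than the disclosed implementation.

```python
import cv2
import numpy as np
from itertools import combinations

def find_vanishing_point(frame):
    """Estimate a vanishing point as the median pairwise intersection of
    detected line segments (a simple stand-in for a production estimator)."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    segments = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=80,
                               minLineLength=60, maxLineGap=10)
    if segments is None:
        raise ValueError("no line segments detected")
    segs = [list(map(float, seg[0])) for seg in segments[:50]]
    points = []
    for (x1, y1, x2, y2), (x3, y3, x4, y4) in combinations(segs, 2):
        # Intersect the two infinite lines through each segment pair.
        d = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
        if abs(d) < 1e-6:
            continue  # near-parallel pair, no stable intersection
        a = x1 * y2 - y1 * x2
        b = x3 * y4 - y3 * x4
        points.append(((a * (x3 - x4) - (x1 - x2) * b) / d,
                       (a * (y3 - y4) - (y1 - y2) * b) / d))
    return np.median(np.array(points), axis=0)  # (x, y) in pixels

def split_regions_of_interest(frame, vanishing_point):
    """Split the frame along a vertical bisecting line through the
    vanishing point into two halves, one region of interest per side."""
    x = int(round(float(vanishing_point[0])))
    return {"first_half": frame[:, :x], "second_half": frame[:, x:]}
```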
The state determination control unit 112 generates a first end line 902 at one end of the row 901. For example, the first end line 902 can be formed at a predetermined distance from the vanishing point 804 (shown in
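The mapping from the four corners of the bounding quadrilateral to a rectangle is a standard planar perspective (homography) transform. Below is a minimal sketch assuming OpenCV, with corner coordinates obtained from the end lines and boundary lines described above; the output dimensions are arbitrary assumptions.

```python
import cv2
import numpy as np

def correct_perspective(image, corners, out_w=800, out_h=120):
    """Geometrically transform the quadrilateral bounded by the four
    corners into a rectangle, yielding the perspective corrected image.

    corners: (first, second, third, fourth) pixel coordinates as described
    above, i.e. the upper and lower intersections of the first and second
    end lines with the boundary lines.
    """
    first, second, third, fourth = corners
    src = np.float32([first, third, fourth, second])   # TL, TR, BR, BL
    dst = np.float32([[0, 0], [out_w - 1, 0],
                      [out_w - 1, out_h - 1], [0, out_h - 1]])
    matrix = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(image, matrix, (out_w, out_h))
```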
The uniform size of each bin shape of the associated overhead stowage bin assemblies decreases complexity of analysis, and therefore reduces computing power. In this manner, the state determination control unit 112 operates with increased efficiency. In general, the uniform size of the bin shapes enables improved feature extraction for bin status determination.
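Because each rectified bin occupies a uniform footprint, individual components can be extrapolated by slicing the rectangle into equal segments and reducing each to a simple image attribute. A minimal sketch, assuming the bin count is known from an initial calibration:

```python
import numpy as np

def extrapolate_bins(rectified_row, num_bins):
    """Slice the perspective corrected row into equally sized bin images."""
    width = rectified_row.shape[1]
    edges = np.linspace(0, width, num_bins + 1, dtype=int)
    return [rectified_row[:, edges[i]:edges[i + 1]] for i in range(num_bins)]

def bin_features(bin_images):
    """Reduce each bin image to one attribute: its mean color per channel."""
    return np.array([img.reshape(-1, img.shape[-1]).mean(axis=0)
                     for img in bin_images], dtype=np.float32)
```

The resulting per-bin features can feed the clustering sketch shown earlier.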
Embodiments of the subject disclosure provide systems and methods that allow large amounts of data to be quickly and efficiently analyzed by a computing device. For example, an internal cabin of a vehicle can include numerous components that are movable between open and closed states, and which can be overlooked by attendants and other individuals within the internal cabin. As such, large amounts of data are being tracked and analyzed by the state determination control unit 112. The vast amounts of data are efficiently organized and/or analyzed by the state determination control unit 112, as described above. The state determination control unit 112 analyzes the data in a relatively short time in order to quickly and efficiently output information as to which particular components are opened or closed. A human being would be incapable of efficiently analyzing such amounts of data in such a short time. As such, embodiments of the subject disclosure provide increased and efficient functionality, and vastly superior performance in relation to a human being analyzing the data.
In at least one embodiment, components of the system 100, such as the state determination control unit 112, provide and/or enable a computer system to operate as a special purpose computer system for detecting states of components within an internal cabin.
Optionally, instead of an aircraft, embodiments of the subject disclosure may be used with various other vehicles, such as automobiles, buses, locomotives and train cars, watercraft, and the like. Further, embodiments of the subject disclosure may be used with respect to fixed structures, such as commercial and residential buildings.
Referring to
Further, the disclosure comprises embodiments according to the following clauses:
Clause 1. A system comprising:
Clause 2. The system of Clause 1, wherein the state comprises one or both of an open state or a closed state.
Clause 3. The system of Clauses 1 or 2, wherein the one or more components are one or more overhead stowage bin assemblies.
Clause 4. The system of any of Clauses 1-3, wherein the state determination control unit determines a region of interest within the image data.
Clause 5. The system of any of Clauses 1-4, wherein the state determination control unit determines the region of interest by determining a vanishing point within the image.
Clause 6. The system of any of Clauses 1-5, wherein the state determination control unit determines a bisecting line that passes through the vanishing point, wherein the region of interest is determined by the bisecting line.
Clause 7. The system of any of Clauses 1-6, wherein the state determination control unit corrects for perspective within the image data.
Clause 8. The system of any of Clauses 1-7, wherein the one or more components comprise a plurality of components in a row, wherein the state determination control unit generates a first end line at a first end of the row and a second end line at a second end of the row opposite from the first end line, wherein the state determination control unit further generates an upper boundary line associated with an upper edge of the row, and a lower boundary line associated with a lower edge of the row, wherein the first end line and the upper boundary line intersect at a first corner, wherein the first end line and the lower boundary line intersect at a second corner, wherein the second end line and the upper boundary line intersect at a third corner, wherein the second end line and the lower boundary line intersect at a fourth corner, and wherein the first corner, the second corner, the third corner, and the fourth corner are corners of a quadrilateral that bounds the plurality of components.
Clause 9. The system of any of Clauses 1-8, wherein the state determination control unit geometrically transforms the quadrilateral into a rectangle having respective corresponding corners to form a perspective corrected image.
Clause 10. The system of any of Clauses 1-9, wherein the state determination control unit extrapolates a shape of each of the one or more components within the image data.
Clause 11. The system of any of Clauses 1-10, wherein the state determination control unit determines the state of the one or more components through clustering.
Clause 12. The system of any of Clauses 1-11, wherein the state determination control unit determines the state of the one or more components by determining a difference in image attributes.
Clause 13. The system of any of Clauses 1-12, further comprising a user device, wherein the state determination control unit outputs a state signal indicative of the state of the one or more components to the user device.
Clause 14. The system of any of Clauses 1-13, wherein the state determination control unit outputs a component control signal that automatically operates the one or more components based on the state of the one or more components.
Clause 15. A method comprising:
Clause 16. The method of Clause 15, wherein the state comprises one or both of an open state or a closed state.
Clause 17. The method of Clauses 15 or 16, wherein the one or more components are one or more overhead stowage bin assemblies.
Clause 18. The method of any of Clauses 15-17, wherein said determining comprises determining a region of interest within the image data.
Clause 19. The method of any of Clauses 15-18, wherein said determining further comprises determining the region of interest by determining a vanishing point within the image.
Clause 20. The method of any of Clauses 15-19, wherein said determining further comprises determining a bisecting line that passes through the vanishing point, and determining the region of interest by the bisecting line.
Clause 21. The method of any of Clauses 15-20, further comprising correcting for perspective, by the state determination control unit, within the image data.
Clause 22. The method of any of Clauses 15-21, wherein the one or more components comprise a plurality of components in a row, and wherein said determining comprises:
Clause 23. The method of any of Clauses 15-22, wherein said determining further comprises geometrically transforming the quadrilateral into a rectangle having respective corresponding corners to form a perspective corrected image.
Clause 24. The method of any of Clauses 15-23, wherein said determining further comprises extrapolating a shape of each of the one or more components within the image data.
Clause 25. The method of any of Clauses 15-24, wherein said determining comprises determining the state of the one or more components through clustering.
Clause 26. The method of any of Clauses 15-25, wherein said determining comprises determining a difference in image attributes.
Clause 27. The method of any of Clauses 15-26, further comprising outputting, by the state determination control unit, a state signal indicative of the state of the one or more components to a user device.
Clause 28. The method of any of Clauses 15-27, further comprising outputting, by the state determination control unit, a component control signal that automatically operates the one or more components based on the state of the one or more components.
Clause 29. A non-transitory computer-readable storage medium comprising executable instructions that, in response to execution, cause a system comprising a processor to perform operations comprising:
Clause 30. The non-transitory computer-readable storage medium of Clause 29, wherein said determining comprises determining a region of interest within the image data.
Clause 31. The non-transitory computer-readable storage medium of Clauses 29 or 30, further comprising correcting for perspective within the image data.
Clause 32. The non-transitory computer-readable storage medium of any of Clauses 29-31, wherein said determining further comprises extrapolating a shape of each of the one or more components within the image data.
Clause 33. The non-transitory computer-readable storage medium of any of Clauses 29-32, wherein said determining further comprises determining the state of the one or more components through clustering.
While various spatial and directional terms, such as top, bottom, lower, mid, lateral, horizontal, vertical, front and the like can be used to describe embodiments of the subject disclosure, it is understood that such terms are merely used with respect to the orientations shown in the drawings. The orientations can be inverted, rotated, or otherwise changed, such that an upper portion is a lower portion, and vice versa, horizontal becomes vertical, and the like.
As used herein, a structure, limitation, or element that is “configured to” perform a task or operation is particularly structurally formed, constructed, or adapted in a manner corresponding to the task or operation. For purposes of clarity and the avoidance of doubt, an object that is merely capable of being modified to perform the task or operation is not “configured to” perform the task or operation as used herein.
It is to be understood that the above description is intended to be illustrative, and not restrictive. For example, the above-described embodiments (and/or aspects thereof) can be used in combination with each other. In addition, many modifications can be made to adapt a particular situation or material to the teachings of the various embodiments of the disclosure without departing from their scope. While the dimensions and types of materials described herein are intended to define the parameters of the various embodiments of the disclosure, the embodiments are by no means limiting and are exemplary embodiments. Many other embodiments will be apparent to those of skill in the art upon reviewing the above description. The scope of the various embodiments of the disclosure should, therefore, be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled. In the appended claims and the detailed description herein, the terms “including” and “in which” are used as the plain-English equivalents of the respective terms “comprising” and “wherein.” Moreover, the terms “first,” “second,” and “third,” etc. are used merely as labels, and are not intended to impose numerical requirements on their objects. Further, the limitations of the following claims are not written in means-plus-function format and are not intended to be interpreted based on 35 U.S.C. § 112(f), unless and until such claim limitations expressly use the phrase “means for” followed by a statement of function void of further structure.
This written description uses examples to disclose the various embodiments of the disclosure, including the best mode, and also to enable any person skilled in the art to practice the various embodiments of the disclosure, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the various embodiments of the disclosure is defined by the claims, and can include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if the examples have structural elements that do not differ from the literal language of the claims, or if the examples include equivalent structural elements with insubstantial differences from the literal language of the claims.
Owyang, Ethan Carl, Tripathi, Shubham, Radhakrishnan, Venkatesh Babu, Rakesh, Mugaludi Ramesha, Savadamuthu, Madhanmohan