Augmentable and manipulable modeling of equipment provides a three dimensional, manipulable base model image of an equipment specimen on a display system using optical base model data obtained using a computing system. Base model data collection can be user-controlled to permit changing spatial aspects of the base model image (size, perspective, orientation). User inputs concerning functions, conditions and the like can be transmitted from a model control unit to the computing system via a two-way communication link, generating augmenting data that is combined with the base model data to render augmented three dimensional models. User inputs also can be received directly by the computing system, for example using a display system touchscreen. Augmenting data received by the computing system and/or display system also can be transmitted via the communication link to the model control unit. Such implementations permit realistic simulation of the effects of a control system on a real world system, equipment, etc.
1. A method for generating a spatially manipulable and augmentable model of industrial automation equipment on a display system, the method comprising:
in a computing system comprising a data acquisition device, receiving base model data comprising spatial data defining a spatial relationship between the data acquisition device and an image of the industrial automation equipment;
in the computing system, generating a base model image by processing the base model data;
in the computing system, receiving augmenting data from a model control unit external to the computing system, wherein the augmenting data comprises updated spatial data defining an updated spatial relationship between the data acquisition device and the industrial automation equipment; and
in the computing system, generating an augmented model image by modifying the base model image according to the augmenting data.
8. One or more computer-readable storage media having program instructions stored thereon to generate augmentable and spatially manipulable model images of industrial automation equipment, wherein the program instructions, when executed by a computing system, direct the computing system to at least:
receive base model data associated with an image of the industrial automation equipment;
generate a base model image by processing the base model data, wherein the base model image comprises an operational characteristic of the industrial automation equipment;
enable display of the base model image on a display system, wherein the computing system comprises the display system;
receive augmenting data from a model control unit external to the computing system;
generate an augmented model image by modifying the operational characteristic in accordance with the augmenting data; and
enable display of the augmented model image on the display system.
16. A computing apparatus comprising:
a storage device;
a display system;
a data acquisition device;
a processor operatively coupled with the storage device; and
program instructions for generating a spatially manipulable and augmentable model of industrial automation equipment, wherein the program instructions, when executed by the processor, direct the computing apparatus to at least:
receive base model data comprising spatial data defining a spatial relationship between the data acquisition device and an equipment specimen;
generate a base model image of the equipment specimen on the display system based on the base model data;
receive augmenting data from a model control unit external to the computing apparatus, wherein the augmenting data comprises updated spatial data defining an updated spatial relationship between the data acquisition device and the equipment specimen; and
generate an augmented model image by modifying the base model image according to the augmenting data.
2. The method of
3. The method of
4. The method of
5. The method of
receiving additional augmenting data via a touch screen; and
communicating the additional augmenting data to the model control unit.
6. The method of
7. The method of
9. The one or more computer-readable storage media of
10. The one or more computer-readable storage media of
11. The one or more computer-readable storage media of
12. The one or more computer-readable storage media of
13. The one or more computer-readable storage media of
14. The one or more computer-readable storage media of
receive additional augmenting data via a touch screen; and
communicate the additional augmenting data to the model control unit.
15. The one or more computer-readable storage media of
17. The computing apparatus of
wherein the augmenting data further comprises user-input data pertaining to changing operation of the equipment specimen.
18. The computing apparatus of
19. The computing apparatus of
acquire, via the data acquisition device, the base model data from the equipment specimen.
20. The computing apparatus of
receive additional augmenting data via the display system; and
communicate the additional augmenting data to the model control unit.
This application claims the benefit of, and priority to, U.S. Provisional Patent Application No. 62/212,402, entitled “AUGMENTABLE AND SPATIALLY MANIPULABLE 3D MODELING”, filed Aug. 31, 2015, and which is hereby incorporated by reference in its entirety for all purposes.
Aspects of the disclosure are related to computing hardware and software technology.
Large and/or complex equipment (e.g., machinery, environments, systems, etc. in an industrial automation environment) frequently has been demonstrated for and/or reviewed by individuals using demonstration systems (e.g., devices and software); such demonstrations are common in sales, training, troubleshooting and other scenarios. These demonstration systems display operational and performance characteristics of the equipment using block, polygonal and other symbolic and/or representational two dimensional graphics so that various features of the equipment can be demonstrated without the equipment itself having to be present.
Equipment demonstration software applications that run on computing systems commonly provide some type of user interface to present information to the user and to receive user inputs. One example of such a system is the Allen-Bradley PanelView Plus 1000 demonstration system. Most applications present the user with a static list of functions on the user interface from which the user may select. For example, an operating system typically enables a user to select from menus of system functions, installed applications, and other available operations on the home screen of the operating system. Specific demonstration software can also provide various options to be selected by a user concerning the operation and performance of the equipment being considered.
However, such applications do not display these operational and performance characteristics and information in a manner that allows the user to consider how the equipment operates in a three dimensional setting, such as a factory floor or other real world operational location. Moreover, such applications fail to show the user what the equipment looks like in reality while operating under various user-controlled and user-selected conditions.
Techniques, apparatus, methods and computer program products that permit augmentable and manipulable modeling of equipment (e.g., in an industrial automation environment) are disclosed herein. In at least one implementation, a three dimensional, manipulable base model image of an equipment specimen is rendered on a display system using optical base model data obtained using a computing system. The base model data collection component of the computing system can be user-controlled to permit changing the spatial aspects of the base model image (e.g., its size, perspective, orientation in the modeling environment). Model control data such as user inputs concerning functions, conditions and the like can be transmitted from a model control unit (e.g., an industrial control system) to the computing system via a two-way communication link, generating augmenting data that is combined with the base model data to render augmented three dimensional models. User inputs also can be received directly by the computing system, for example using a touchscreen that is part of the display system. Such augmenting data received by the computing system and/or display system also can be transmitted via the communication link to the model control unit. Such implementations permit realistic simulation of the effects of a control system on a real world system, equipment, etc. The model control unit (e.g., a demonstration unit) is used to drive a realistic three dimensional model on the connected computing system (e.g., on a tablet or the like) even though the actual equipment is not present—the model control unit and computing system communicate to drive the modeling application. As the computing system (or at least its data acquisition device, such as a camera) is moved relative to the base model data source, corresponding visual changes can be made to the model, reproducing the effect of being in the same real space as the equipment being modeled.
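By way of illustration only, the following Python sketch shows one way the data flow described above might be organized: base model data establishes an initial model state, and augmenting data received over the communication link (or entered on the touchscreen) is combined with it to produce an augmented model. The disclosure does not specify an implementation language or data structures; the class, function and field names here (ModelState, apply_augmenting_data, effects) are illustrative assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class ModelState:
    """Hypothetical state for the rendered 3D model (names are assumptions)."""
    scale: float = 1.0             # spatial aspect: size
    orientation_deg: float = 0.0   # spatial aspect: perspective/orientation
    effects: set = field(default_factory=set)  # operational conditions shown

def apply_augmenting_data(model: ModelState, augmenting: dict) -> ModelState:
    """Combine augmenting data with the base model to yield an augmented model.

    Spatial keys would come from movement of the data acquisition device;
    operational keys from user selections on the model control unit or touchscreen.
    """
    if "scale" in augmenting:
        model.scale = augmenting["scale"]
    if "orientation_deg" in augmenting:
        model.orientation_deg = augmenting["orientation_deg"]
    model.effects |= set(augmenting.get("add_effects", ()))
    model.effects -= set(augmenting.get("remove_effects", ()))
    return model

# Example: the control unit reports a fault while the camera has moved closer.
model = apply_augmenting_data(ModelState(), {"scale": 1.4, "add_effects": ["smoke"]})
print(model)
```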
This Overview is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. It should be understood that this Overview is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
Many aspects of the disclosure can be better understood with reference to the following drawings. While several implementations are described in connection with these drawings, the disclosure is not limited to the implementations disclosed herein. On the contrary, the intent is to cover all alternatives, modifications, and equivalents.
The following description and associated figures teach the best mode of the invention. For the purpose of teaching inventive principles, some conventional aspects of the best mode may be simplified or omitted. The following claims specify the scope of the invention. Note that some aspects of the best mode may not fall within the scope of the invention as specified by the claims. Thus, those skilled in the art will appreciate variations from the best mode that fall within the scope of the invention. Those skilled in the art will appreciate that the features described below can be combined in various ways to form multiple variations of the invention. As a result, the invention is not limited to the specific examples described below, but only by the claims and their equivalents.
Implementations disclosed herein provide for displaying augmentable three dimensional modeling of an equipment specimen or the like. Applications typically present users with a graphic representation of functions, conditions and operations of equipment being demonstrated, reviewed, etc. When the equipment specimen is sufficiently large and thus not portable, or is otherwise unavailable for demonstrating its operation, two dimensional graphic depictions of operational characteristics, performance, features and the like have typically been used. Frequently this provides the user with less information than is desired or necessary for evaluation of the equipment specimen. Apparatus, techniques, methods and systems disclosed herein help to enhance the demonstrative value of user interaction and evaluation by providing an augmentable three dimensional model that permits not only viewing of a three dimensional demonstration of the equipment specimen, but also manipulation of the augmented model to allow the user full inspection of the model under operational conditions.
In at least one implementation, a computing system acquires target image data from a target image source and receives control data from a model control unit, a display device, a user interface and/or other appropriate control data source that can provide augmenting data. The target image data can include base model data that relates to an equipment specimen comprising one or more machines or the like, or any other industrial asset(s) in an industrial automation environment. Depending on the equipment specimen selected and acquired by the computer system, various control options are presented to a user via the model control unit. Selection of one or more control options generates augmenting data that is received and processed by the computer system to generate an augmented model of the equipment specimen (i.e., a model showing modifications to a base model) on a computer system display system.
Subsequent changes to the augmenting data and/or further inputs of augmenting data alter the augmented model. Likewise, user interaction with the augmented model (e.g., via a touchscreen or the like on the computer system, or changes of the relative position of a base model data acquisition device relative to a base model data source) generates changes in the presentation of the equipment specimen.
Movement of the target image source and/or the target image capturing device alters the spatial presentation of the base model to which the augmenting data is applied (e.g., by providing updated spatial data that modifies the spatial data originally provided as a component of base model data used to render an image of the base model). Therefore, a user can move the base model data acquisition device about the base model data source just as a person standing in the same space as a real equipment specimen could walk around the equipment (and/or move the equipment to view different perspectives of the equipment at different distances/orientations). Likewise, movement of the base model data source can change the three dimensional base model's position, size and/or orientation. Thus the augmented model changes and/or updates dynamically as user interaction and selections are implemented.
Referring to the drawings,
Turning to
Computing system 100 also includes an optical data acquisition device 102 (e.g., a camera or other reader that can be built in or otherwise mounted to computing system 100). When optical data acquisition device 102 first acquires the base model data 122, that optical data is processed by computing system 100 to generate a three dimensional base model image 104, as seen in
Moreover, as the optical data acquisition device 102 and base model data source 120 move angularly relative to one another, the orientation or perspective view of base model image 104 can likewise change, as seen in
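As a rough illustration of this spatial behavior, the Python sketch below maps the relative positions of the optical data acquisition device and the base model data source to a model scale and orientation. The inverse-distance scaling rule and the reference-distance constant are assumptions made for the example; the disclosure does not prescribe a particular formula.

```python
import math

REFERENCE_DISTANCE = 1.0  # assumed distance (meters) at which scale == 1.0

def spatial_aspects(camera_pos, source_pos):
    """Derive model size and orientation from the device/source relationship."""
    dx = camera_pos[0] - source_pos[0]
    dy = camera_pos[1] - source_pos[1]
    dz = camera_pos[2] - source_pos[2]
    distance = math.sqrt(dx * dx + dy * dy + dz * dz)
    scale = REFERENCE_DISTANCE / max(distance, 1e-6)  # closer camera -> larger image
    heading_deg = math.degrees(math.atan2(dy, dx))    # walking around -> rotation
    return scale, heading_deg

# Moving the camera from 2.0 m away to 0.5 m away enlarges the rendered model.
print(spatial_aspects((2.0, 0.0, 0.0), (0.0, 0.0, 0.0)))  # (0.5, 0.0)
print(spatial_aspects((0.5, 0.0, 0.0), (0.0, 0.0, 0.0)))  # (2.0, 0.0)
```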
In the exemplary base model image 104 of
A model control unit 130 is coupled to computing system 100 via communication link 140. In some implementations model control unit 130 and computing system 100 as illustrated in the Figures can be considered a single computing system implementing processes and methods described herein. Link 140 can be a single element or component, or it can be composed of multiple segments, devices, etc. that provide for appropriate signal processing, communication bridging and the like between model control unit 130 and computing system 100. The communication link 140 can connect local and/or remote model control units and can permit two-way communication between the model control unit 130 and the computing system 100. In implementations using model control units 130 that are providing demonstrations and/or other interactive activity, as shown in
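The disclosure leaves the transport and message format of communication link 140 open. As one possible realization only, the sketch below exchanges newline-delimited JSON messages over a TCP socket, which would accommodate the two-way traffic described above (control signals toward the computing system, touchscreen edits back toward the model control unit). The message schema is hypothetical.

```python
import json
import socket

def send_message(sock: socket.socket, message: dict) -> None:
    """Send one newline-delimited JSON message to the peer."""
    sock.sendall((json.dumps(message) + "\n").encode("utf-8"))

def iter_messages(sock: socket.socket):
    """Yield decoded messages as they arrive; returns when the peer closes."""
    buffer = b""
    while True:
        chunk = sock.recv(4096)
        if not chunk:
            return
        buffer += chunk
        while b"\n" in buffer:
            line, buffer = buffer.split(b"\n", 1)
            yield json.loads(line)

# For example, a model control unit might send {"signal": "jam"} and later
# receive {"edit": "remove_smoke"} from the computing system on the same socket.
```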
Model control unit 130 provides options for users to control, configure and operate an augmentable and spatially manipulable 3D base model image of an equipment specimen. Some types of demonstration equipment have been used to demonstrate equipment operation without the equipment being present; however, such demonstrations utilized two dimensional graphics and images that provided limited realism for users operating the demonstration equipment. Using implementations disclosed herein, operational and other selections implemented by users of such demonstration equipment generate data that is communicated (either directly or after suitable processing) to the computing system 100 to generate augmenting data that can be combined with the 3D base model image 104 to illustrate how an equipment specimen actually operates in a real world environment. In some implementations such augmenting data can include both spatial changes (e.g., moving a camera or mobile device that is used to receive base model data) and operational changes (e.g., user inputs to change equipment operation).
For example, in some implementations, one or more of which are shown in
To demonstrate jamming of the rotating drum, for example, a “drum jam” option on model control unit 130 (or on computing system 100) can then be selected, generating a jam signal 134 that can be used internally within model control unit 130 (e.g., again as it would be in a standard demonstration device), but which also is sent via communication link 140 to computing system 100. Jam signal 134 is processed to produce supplemental augmenting data that updates augmented model image 152 to create an updated augmented model image 154 in which rotating drum 105 has jammed (as indicated by drum rattling 103 and by smoke 106 in
If a user wants to see what alarm options are available for a drum jam scenario, an “alarm” option (or a plurality of alarm options) can then be selected on model control unit 130 (or computing system 100), generating an alarm signal 136 that can be used internally within model control unit 130 (e.g., again as would be done in a standard demonstration device), but which also is sent via communication link 140 to computing system 100. Alarm signal 136 is processed to produce augmenting data that again updates augmented model image 154 to create an updated augmented model image 156 in which rotating drum 105 has jammed and one or more alarms have been triggered (as indicated by illumination of the warning light on tower 108 and the warning message appearing on screen 107 of the equipment specimen in
If a user wants to see the equipment specimen without the smoke 106 obscuring part of the view of the machinery, then the user can delete the smoke 106 by selecting that option on computing system 100. As seen in
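One way to read the preceding start/jam/alarm walkthrough is as a small state machine in which each control-unit signal advances the modeled equipment and adds visual overlays to the augmented image, while a touchscreen edit removes one. The Python sketch below is a schematic reconstruction under that reading; the signal names and overlay labels are illustrative, not taken from the disclosure.

```python
from enum import Enum, auto

class DrumState(Enum):
    IDLE = auto()
    RUNNING = auto()
    JAMMED = auto()
    ALARMED = auto()

# Signal -> (next state, overlays added to the augmented model image).
TRANSITIONS = {
    (DrumState.IDLE, "start"): (DrumState.RUNNING, {"drum_rotation"}),
    (DrumState.RUNNING, "jam"): (DrumState.JAMMED, {"drum_rattle", "smoke"}),
    (DrumState.JAMMED, "alarm"): (DrumState.ALARMED, {"tower_light", "screen_warning"}),
}

def handle_signal(state, overlays, signal):
    """Apply one control-unit signal or touchscreen edit to the model."""
    if signal == "remove_smoke":  # user edit made directly on the computing system
        return state, overlays - {"smoke"}
    next_state, added = TRANSITIONS.get((state, signal), (state, set()))
    return next_state, overlays | added

state, overlays = DrumState.IDLE, set()
for sig in ("start", "jam", "alarm", "remove_smoke"):
    state, overlays = handle_signal(state, overlays, sig)
    print(sig, "->", state.name, sorted(overlays))
```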
Turning now to
Computing system 500 may be representative of any computing apparatus, system, or systems on which application 526 and display processes 480, 490 or variations thereof may be suitably implemented. Examples of computing system 500 include mobile computing devices, such as cell phones, smartphones, tablet computers, laptop computers, wearable computing devices, notebook computers, and gaming devices, as well as any other type of mobile computing devices and any combination or variation thereof. Note that the features and functionality of computing system 500 may apply as well to desktop computers, server computers, and virtual machines, as well as any other type of computing system, variation, or combination thereof. In implementations utilizing these types of devices, components such as an optical data acquisition device may or may not be integral to the device.
Computing system 500 includes processing system 501, storage system 504, software 506, communication interface 511, user interface 513, and optical data acquisition interface 515. Processing system 501 is operatively coupled with storage system 504, communication interface 511, user interface 513, and optical data acquisition interface 515. User interface 513 can include one or more implementations of means for user interaction with computing system 500, including a touchscreen as part of display system 510. Other user interface interaction means can include a keyboard, mouse, stylus, voice command system and others.
Processing system 501 loads and executes software 506 from storage system 504. When executed by computing system 500 in general, and processing system 501 in particular, software 506 directs computing system 500 to operate as described herein for display processes 480, 490 or variations thereof, including descriptions of processes and operations relating to
Referring still to
Storage system 504 may comprise any computer-readable media or storage media readable by processing system 501 and capable of storing software 506. Storage system 504 may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer-readable instructions, data structures, program modules, or other data. Storage system 504 may be implemented as a single storage device but may also be implemented across multiple storage devices or sub-systems co-located or distributed relative to each other. Storage system 504 may comprise additional elements, such as a controller, capable of communicating with processing system 501. Examples of storage media include random access memory, read only memory, magnetic disks, optical disks, flash memory, virtual memory and non-virtual memory, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and that may be accessed by an instruction execution system, as well as any combination or variation thereof, or any other type of storage media. In no case is the storage media a propagated signal.
In operation, in conjunction with user interface 513, processing system 501 loads and executes portions of software 506, such as display processes 480, 490, to render a base model image, an augmented model image, and/or a graphical user interface for application 526 for display by display system 510 of user interface 513. Software 506 may be implemented in program instructions and among other functions may, when executed by computing system 500 in general or processing system 501 in particular, direct computing system 500 or processing system 501 to identify an organizational role of a user of computing system 500. Software 506 may further direct computing system 500 or processing system 501 to determine a set of tasks for the user based on the organizational role of the user. Software 506 may further direct computing system 500 or processing system 501 to generate a base model image or to generate an augmented model image based on a combination of a base model image and augmenting data based on inputs, data and other information. Finally, software 506 may direct changes and updates being received by computing system 500 or being transmitted by computing system 500, based on how augmenting data is received and processed.
Software 506 may include additional processes, programs, or components, such as operating system software or other application software. Examples of operating systems include Windows®, iOS®, and Android®, as well as any other suitable operating system. Software 506 may also comprise firmware or some other form of machine-readable processing instructions executable by processing system 501.
In general, software 506 may, when loaded into processing system 501 and executed, transform computing system 500 overall from a general-purpose computing system into a special-purpose computing system customized to facilitate displaying tasks for one or more users and/or one or more model control units as described herein for each implementation. For example, encoding software 506 on storage system 504 may transform the physical structure of storage system 504. The specific transformation of the physical structure may depend on various factors in different implementations of this description. Examples of such factors may include, but are not limited to the technology used to implement the storage media of storage system 504 and whether the computer-readable storage media are characterized as primary or secondary storage.
In some examples, if the computer-readable storage media are implemented as semiconductor-based memory, software 506 may transform the physical state of the semiconductor memory when the program is encoded therein. For example, software 506 may transform the state of transistors, capacitors, or other discrete circuit elements constituting the semiconductor memory. A similar transformation may occur with respect to magnetic or optical media. Other transformations of physical media are possible without departing from the scope of the present description, with the foregoing examples provided only to facilitate this discussion.
It should be understood that computing system 500 is generally intended to represent a computing system with which software 506 is deployed and executed in order to implement application 526 and/or display processes 480, 490 (and variations thereof, including processes and operations relating to
Communication interface 511 may include communication connections and devices that allow for communication between computing system 500 and other computing systems (not shown) or services, over a communication link 540 (including a network) or collection of networks. In some implementations, communication interface 511 receives augmenting data from a model control unit 530 over communication link 540. As seen in
User interface 513 may include a voice input device, a touch input device for receiving a gesture from a user, a motion input device for detecting non-touch gestures and other motions by a user, and other comparable input devices and associated processing elements capable of receiving user input from a user. Output devices such as display system 510, speakers, haptic devices, and other types of output devices may also be included in user interface 513. Moreover, input and output capabilities may be combined in one or more devices or features of computing system 500. The aforementioned user input devices are well known in the art and need not be discussed at length here. User interface 513 may also include associated user interface software executable by processing system 501 in support of the various user input and output devices discussed above. Separately or in conjunction with each other and other hardware and software elements, the user interface software and devices may provide a graphical user interface, a natural user interface, or any other kind of user interface, including interfaces integral to the presentation of a base model image and/or augmented model image.
The functional block diagrams, operational sequences, and flow diagrams provided in the Figures are representative of exemplary architectures, environments, and methodologies for performing novel aspects of the disclosure. While, for purposes of simplicity of explanation, methods included herein may be in the form of a functional diagram, operational sequence, or flow diagram, and may be described as a series of acts, it is to be understood and appreciated that the methods are not limited by the order of acts, as some acts may, in accordance therewith, occur in a different order and/or concurrently with other acts from that shown and described herein. For example, those skilled in the art will understand and appreciate that a method could alternatively be represented as a series of interrelated states or events, such as in a state diagram. Moreover, not all acts illustrated in a methodology may be required for a novel implementation.
The above description and associated drawings teach the best mode of the invention. Various technical effects will be appreciated based on the foregoing, including improved modeling that combines inputs from a connected model control unit and the display system, and the ability to update the augmented model both with regard to operational conditions and features and with regard to the manipulable spatial size, perspective and orientation of the three dimensional model, permitting improved control and evaluation of such models. The following claims specify the scope of the invention. Some aspects of the best mode may not fall within the scope of the invention as specified by the claims. Also, while the preceding discussion describes embodiments employed specifically in conjunction with the monitoring and analysis of industrial processes, other applications, such as the mathematical modeling or monitoring of any man-made or naturally-existing system, may benefit from use of the concepts discussed above. Further, those skilled in the art will appreciate that the features described above can be combined in various ways to form multiple variations of the invention. As a result, the invention is not limited to the specific embodiments described above, but only by the following claims and their equivalents.
Capozella, Michael, Griesmer, Paul
Patent | Priority | Assignee | Title |
7564469, | Aug 29 2005 | NANT HOLDINGS IP, LLC | Interactivity with a mixed reality |
8487962, | Mar 06 2006 | D4D Technologies, LLC | Augmented reality system for a dental laboratory |
8718612, | Mar 08 2011 | Bank of America Corporation | Real-time analysis involving real estate listings |
8803916, | May 03 2012 | T-MOBILE INNOVATIONS LLC | Methods and systems for an augmented reality service delivery platform |
20110164114
20130135295
20140002493
20140043329
20150006361
20150109480
20150154322
20150161821
20150187136
20160012160
20160292920
20160327293
EP2889844
Executed on | Assignor | Assignee | Conveyance | Reel/Frame
Mar 04 2016 | CAPOZELLA, MICHAEL | ROCKWELL AUTOMATION TECHNOLOGIES, INC. | ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS) | 038097/0356
Mar 24 2016 | Rockwell Automation Technologies, Inc. | (assignment on the face of the patent) | |
Mar 24 2016 | GRIESMER, PAUL | ROCKWELL AUTOMATION TECHNOLOGIES, INC. | ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS) | 038097/0356