Systems and methods for managing errors utilizing augmented reality are provided. One system includes a transceiver configured to communicate with a systems management console, a capture device for capturing environmental inputs, a memory storing code comprising an augmented reality module, and a processor. The processor, when executing the code comprising the augmented reality module, is configured to perform the method below. One method includes capturing an environmental input, identifying a target device in the captured environmental input, and querying the systems management console regarding a status condition for the target device. Also provided are physical computer storage mediums including a computer program product for performing the above method.
1. A method for managing errors utilizing augmented reality in a system including a transceiver configured to communicate with a systems management console, a capture device for capturing environmental inputs, memory storing code comprising an augmented reality module, an input device configured to receive inputs from a user, a display, and a processor coupled to the transceiver, the capture device, the memory, the input device, and the display, the method comprising:
capturing, via the capture device, an environmental input from a target device, the environmental input including a tactile cue, the target device being remote from the systems management console;
identifying, via the processor, the target device in the captured tactile cue;
querying, by the processor utilizing the transceiver, the systems management console regarding a status condition of the target device;
receiving, via the transceiver, information regarding the status condition from the systems management console;
displaying, via the display, the information regarding the status condition and, if the status condition is a fault condition, launching a repair interface directly to a problem page for the determined fault condition for a user to address the fault condition;
synthesizing the received information and the captured tactile cue;
receiving, via the input device, an input to the repair interface from the user for troubleshooting the status condition when the status condition is the fault condition;
transmitting, via the transceiver, the input to the systems management console;
receiving, via the input device, another input from the user for correcting the fault condition; and
transmitting, via the transceiver, the other input to the systems management console for correcting the fault condition.
2. The method of
3. The method of
4. The method of
5. The method of
This application is a Continuation of U.S. patent application Ser. No. 13/082,291, filed on Apr. 7, 2011.
1. Field of the Invention
The present invention relates generally to computing systems and, more particularly, to systems and methods for managing errors utilizing augmented reality.
2. Description of the Related Art
Augmented reality is utilized primarily to passively obtain information regarding a particular location identified in a captured image. For example, a user can obtain the menu of a particular restaurant via augmented reality utilizing a captured image of the restaurant. In this example, a captured image of the exterior of the restaurant is used to identify the restaurant. Once the restaurant is identified, a user is capable of obtaining previously stored information related to the restaurant (e.g., the menu, consumer rating, location, etc.). While obtaining information via augmented reality is helpful in many situations, contemporary uses of augmented reality are limited.
Various embodiments provide systems for managing errors utilizing augmented reality. One system comprises a transceiver configured to communicate with a systems management console, a capture device for capturing environmental inputs, a memory storing code comprising an augmented reality module, and a processor coupled to the transceiver, the capture device, and the memory. In one embodiment, the processor, when executing the code comprising the augmented reality module, is configured to capture an environmental input via the capture device, identify a target device in the captured environmental input, and query, utilizing the transceiver, the systems management console regarding a status condition of the target device.
Other embodiments provide methods for managing errors utilizing augmented reality in a system including a transceiver configured to communicate with a systems management console, a capture device for capturing environmental inputs, memory storing code comprising an augmented reality module, and a processor coupled to the transceiver, the capture device, and the memory. One method comprises capturing an environmental input via the capture device, identifying, via the processor, a target device in the captured environmental input, and querying, by the processor utilizing the transceiver, the systems management console regarding a status condition of the target device.
Physical computer storage mediums (e.g., an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing) comprising a computer program product for managing errors utilizing augmented reality in a system including a transceiver configured to communicate with a systems management console, a capture device for capturing environmental inputs, memory storing code comprising an augmented reality module, and a processor coupled to the transceiver, the capture device, and the memory are also provided. One physical computer storage medium comprises computer code for capturing an environmental input via the capture device, computer code for identifying, via the processor, a target device in the captured environmental input, and computer code for querying, by the processor utilizing the transceiver, the systems management console regarding a status condition of the target device.
In order that the advantages of the invention will be readily understood, a more particular description of the invention briefly described above will be rendered by reference to specific embodiments that are illustrated in the appended drawings. Understanding that these drawings depict only typical embodiments of the invention and are not therefore to be considered to be limiting of its scope, the invention will be described and explained with additional specificity and detail through the use of the accompanying drawings, in which:
The illustrated embodiments below provide systems and methods for managing errors utilizing augmented reality. Also provided are physical computer storage mediums (e.g., an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing) comprising a computer program product for managing errors utilizing augmented reality in a system (e.g., a computing system).
Turning now to the figures, FIG. 1 is a block diagram of one embodiment of an environment comprising a systems management console 150 coupled to a system 175.
Systems management console 150 is coupled to and configured to manage system 175. As such, systems management console 150 may be any console capable of monitoring system 175 for various statuses of operation. The various statuses include, but are not limited to, normal operation, an error state, a warning state, and the like status. In doing so, systems management console 150 is configured to identify any status and change in status, transmit such status to a querying device, and receive input from one or more users (via the querying device) to repair/correct/troubleshoot any identified errors and/or warnings.
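By way of a non-limiting example, the following Python sketch models the statuses and change detection described above; the `Status` enum and `detect_change` helper are illustrative assumptions of this sketch, not elements of the disclosed console.

```python
from enum import Enum

class Status(Enum):
    """Operational statuses a console such as systems management console 150
    might track; the names here are assumptions, not from the disclosure."""
    NORMAL = "normal operation"
    WARNING = "warning state"
    ERROR = "error state"

def detect_change(previous: Status, current: Status) -> bool:
    """Report whether a monitored device's status changed, so the console
    can transmit the new status to a querying device."""
    return previous is not current
```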
System 175 may be any system and/or device capable of being monitored by systems management console 150, having its statuses identified by systems management console 150, and having any identified errors and/or warnings repaired/corrected by systems management console 150. In one embodiment, system 175 comprises one or more computing devices 1752 (e.g., one or more servers, one or more storage devices, one or more power supplies, one or more blade chassis, etc.) in communication with systems management console 150.
In various embodiments, each of the one or more computing devices 1752 in system 175 comprises an identifier 1754, which may be any type of identifier known in the art or developed in the future. In one embodiment, each identifier 1754 is a bar code or other type of alphanumeric identifier. In another embodiment, each identifier 1754 is a radio frequency identifier (RFID) device (e.g., an RFID tag). In yet another embodiment, the location of each respective computing device 1752 is the identifier 1754 for each computing device 1752. In this embodiment, the location may be with respect to a fixed object, with respect to the environment within which each computing device 1752 resides, and/or with respect to a global position of each computing device. In still another embodiment, the shape, color, size, and/or other physical characteristic of each computing device 1752 is the identifier 1754 for each respective computing device 1752. In an alternative embodiment, a sound, noise, and/or other auditory cue generated by each respective computing device 1752 is the identifier 1754 for each respective computing device 1752. In yet another alternative embodiment, a vibration, a motion, and/or other tactile cue generated by each respective computing device 1752 is the identifier 1754 for each respective computing device 1752.
Each identifier 1754, in one embodiment, is the same type of identifier. For example, each identifier may be a bar code or other alphanumeric identifier that uniquely distinguishes each computing device 1752. In another embodiment, at least two computing devices 1752 include different types of identifiers 1754. For example, a first identifier 1754 on a first computing device 1752 may be a bar code and a second identifier 1754 for a second computing device 1752 may be the global position of the second computing device. Regardless of the type of identifier 1754, mobile device 200 is capable of capturing an environmental input including the identifier 1754.
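By way of a non-limiting example, the following Python sketch models the identifier varieties enumerated above and a registry in which devices mix identifier types (a bar code on one device, a global position on another); the types, field names, and registry values are assumptions of this sketch.

```python
from dataclasses import dataclass
from enum import Enum, auto

class IdentifierType(Enum):
    """Identifier varieties described above; the enum itself is illustrative."""
    BAR_CODE = auto()
    RFID = auto()
    LOCATION = auto()   # relative to a fixed object, the environment, or the globe
    PHYSICAL = auto()   # shape, color, size
    AUDITORY = auto()   # sound or noise generated by the device
    TACTILE = auto()    # vibration or motion generated by the device

@dataclass(frozen=True)
class Identifier:
    kind: IdentifierType
    value: str  # e.g., decoded bar-code text or GPS coordinates

# Devices in the same system may use different identifier types.
registry = {
    Identifier(IdentifierType.BAR_CODE, "0012-3456"): "computing device 1752a",
    Identifier(IdentifierType.LOCATION, "45.42N,75.69W"): "computing device 1752b",
}
```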
With reference now to FIG. 2, one embodiment of mobile device 200 comprises a transceiver 210, one or more input devices 220, a display 230, one or more capture devices 240, a memory 250, and a processor 260 coupled to transceiver 210, input device(s) 220, display 230, capture device(s) 240, and memory 250.
Transceiver 210 may be any system and/or device capable of communicating (e.g., transmitting and receiving data and/or signals) with systems management console 150. As such, transceiver 210 may be any transceiver known in the art or developed in the future.
Input device(s) 220 may be any system and/or device capable of receiving input from a user. Examples of input devices 220 include, but are not limited to, a mouse, a keyboard, a microphone, a touch screen, and the like input devices. As such, input device(s) 220 may be any input device known in the art or developed in the future. In the various embodiments, each input device 220 is in communication with display 230.
Display 230 may be any system and/or device capable of displaying data. As such, display 230 may be any display known in the art or developed in the future. In one embodiment, display 230 includes a touch screen such that display 230 and input device 220 are integrated devices. In various embodiments, display 230 is configured to display data received from systems management console 150, input device(s) 220, and one or more capture devices 240.
Capture device(s) 240 may be any system and/or device capable of capturing environmental inputs (e.g., visual inputs, audio inputs, and tactile inputs). Examples of capture devices 240 include, but are not limited to, a camera, a microphone, a global positioning system (GPS), a gyroscope, a plurality of accelerometers, and the like capture devices. As such, capture device(s) 240 may be any capture device known in the art or developed in the future. In one embodiment, capture device 240 is a camera configured to capture images of the environment surrounding mobile device 200.
Memory 250 may be any system and/or device capable of storing data. In one embodiment, memory 250 stores computer code comprising an augmented reality module 2510. Augmented reality module 2510 comprises instructions that, when executed by processor 260, cause processor 260 to perform a method of managing errors in system 175.
Processor 260 is configured to execute the computer code comprising augmented reality module 2510. When executing augmented reality module 2510, processor 260 is configured to receive and process a captured environmental input representing at least a portion of system 175 from capture device 240.
In processing the captured environmental input, processor 260 is configured to identify one or more target devices in system 175 that are represented in the captured environmental input. For example, if the captured environmental input is an image of at least a portion of system 175, processor 260 is configured to identify one or more target devices in the captured image.
Processor 260 is configured to identify each target device utilizing the identifier 1754 for each respective target device. For example, in a captured image of system 175, processor 260 is configured to identify each target device via a bar code and/or other visual cue(s). In another example, in a captured audio clip of system 175, processor 260 is configured to identify each target device via a sound, noise, and/or other audio cue(s). In still another example, in a captured tactile bit of system 175, processor 260 is configured to identify each target device via a motion, vibration, and/or other tactile cue(s).
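By way of a non-limiting example, the following Python sketch shows the matching step only: cues already extracted from a captured input (a bar-code string from an image, an audio fingerprint from an audio clip, a vibration signature from a tactile bit) are looked up against registered identifiers. The cue strings, registry contents, and function name are assumptions of this sketch; real decoders are out of scope here.

```python
# Cues are whatever a decoder extracts from a captured environmental input.
# All values below are illustrative.
KNOWN_IDENTIFIERS = {
    "barcode:0012-3456": "server target device",
    "audio:fan-whine-7kHz": "cooling target device",
    "tactile:60Hz-pulse": "power-supply target device",
}

def identify_targets(extracted_cues, registry=KNOWN_IDENTIFIERS):
    """Match extracted cues against registered identifiers; unknown cues
    are ignored rather than misattributed to a device."""
    return [registry[cue] for cue in extracted_cues if cue in registry]

print(identify_targets(["barcode:0012-3456", "unrecognized"]))
# ['server target device']
```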
After the target device(s) is/are identified, processor 260 is configured to query systems management console 150 regarding the identified target device(s). In response thereto, processor 260 is configured to receive from systems management console 150 one or more status conditions and overlay the status condition(s) on the captured environmental input. For example, if the captured environmental input is an image of the target device(s), processor 260 is configured to overlay one or more status conditions on the image or portions of the image representing one or more components of the target device(s).
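By way of a non-limiting example, the following Python sketch separates the two operations just described: querying the console for a device's status condition and attaching that status to the image region representing the device. The REST endpoint and the annotation scheme are assumptions of this sketch; the disclosure does not specify a wire protocol or overlay format.

```python
import json
import urllib.request

def query_status(console_url: str, device_id: str) -> dict:
    """Query the management console for one identified device's status
    condition. The endpoint shape is a hypothetical assumption."""
    url = f"{console_url}/devices/{device_id}/status"
    with urllib.request.urlopen(url) as response:
        return json.loads(response.read())

def overlay_status(annotations: dict, device_id: str, status: dict) -> dict:
    """Attach the returned status to the portion of the captured image
    representing the device, for rendering as an AR overlay."""
    annotations[device_id] = status
    return annotations
```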
Furthermore, processor 260 is configured to present to a user cues (audio cues, visual cues (e.g., a hyperlink), tactile cues, etc.) for accessing and launching a repair interface 2610 (see FIG. 2) when the received status condition is a determined fault condition.
In one example, if the power indicator on system 175 is “OFF,” processor 260 will receive from systems management console 150 the reason the power is “OFF” on the target device. In this example, systems management console 150 may indicate that the main power supply is experiencing problems and/or present an error log, and instruct processor 260 to display the problem and/or error log to the user. Processor 260 will then display the problem and/or error log on display 230 and launch repair interface 2610 so that the user is able to address the error condition.
Repair interface 2610 enables the user to remotely utilize systems management console 150 to repair/correct the determined fault condition. To accomplish this, the user provides inputs (e.g., via input device(s) 220) to repair interface 2610. Processor 260 then transmits (e.g., via transceiver 210) the inputs to systems management console 150, which repairs/corrects the determined fault condition in system 175.
Continuing the above example, the user may instruct (e.g., via inputs to repair interface 2610) systems management console 150 to initiate an auxiliary power supply to the target device. In response thereto, systems management console 150 will initiate the auxiliary power supply to correct the determined fault condition.
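By way of a non-limiting example, the following Python sketch simulates this repair loop with a console stand-in: the console reports a power-supply fault, the user's repair-interface input is forwarded to it, and the console (not the mobile device) applies the correction. The class, its methods, and the action string are assumptions of this sketch, not the disclosed console's interface.

```python
class ConsoleStub:
    """Stand-in for systems management console 150; this interface is an
    assumption of the sketch, not the disclosed console's API."""
    def __init__(self):
        self.faults = {"target-device": "main power supply failure"}

    def status(self, device_id: str) -> str:
        return self.faults.get(device_id, "normal operation")

    def apply_repair(self, device_id: str, action: str) -> str:
        # The console performs the correction on the monitored system.
        if action == "initiate auxiliary power supply":
            self.faults.pop(device_id, None)
        return self.status(device_id)

console = ConsoleStub()
if console.status("target-device") != "normal operation":
    # The user's input to the repair interface is transmitted to the console.
    print(console.apply_repair("target-device", "initiate auxiliary power supply"))
    # -> normal operation
```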
In one embodiment, mobile device 200 is a cellular telephone (e.g., a “smart” phone). In other embodiments, mobile device 200 is a computing tablet, a notebook computing device, a netbook computing device, a laptop computing device, and/or the like computing device.
Turning now to FIG. 4, one embodiment of a method 400 for managing errors utilizing augmented reality begins by capturing an environmental input representing at least a portion of a system (e.g., system 175) via a capture device (block 405).
In processing the captured environmental input, method 400 comprises identifying one or more target devices in the system that are represented in the captured environmental input (block 410). In one embodiment, method 400 utilizes a captured image of the system to identify each target device. In another embodiment, method 400 utilizes a captured audio clip of the system to identify each target device. In still another embodiment, method 400 utilizes a captured tactile bit of the system to identify each target device.
After the target device(s) is/are identified, method 400 comprises querying a systems management console (e.g., systems management console 150) regarding the status of the identified target device(s) (block 415). In response thereto, method 400 comprises receiving from the systems management console one or more status conditions for the target device(s) (block 420).
Method 400 further comprises determining if the target device(s) is/are experiencing an error and/or warning condition (block 425). If the status indicates that the target device(s) is/are not experiencing an error/warning condition (i.e., is/are functioning properly), the user is informed (block 430) and method 400 ends. If the status indicates that the target device(s) includes an error/warning condition, method 400 includes launching a repair interface (e.g., repair interface 2610) (block 435).
The repair interface can be targeted at a user interface (e.g., a display) on the mobile device and/or at user interface elements on the device(s) and/or system being monitored (e.g., system 175). For example, display elements on the display (e.g., brightness, flash rate frequency, audio volume, and the like) could be modified/enhanced to further augment the diagnostic task.
After the repair interface is launched, method 400 comprises receiving, from the user, inputs to the repair interface (block 440). The inputs are then transmitted to the systems management console (block 445), which then repairs/corrects the determined fault condition in the system in accordance with the received inputs. Method 400 then ends.
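By way of a non-limiting example, the following Python sketch renders blocks 405-445 of method 400 as plain control flow. Each callable is an injected stand-in for the hardware and console pieces described above; the function name, parameters, and the trivial lambdas in the usage are assumptions of this sketch.

```python
def method_400(capture, identify, query, repair_ui, transmit):
    """Blocks 405-445 of method 400 as a plain control flow."""
    env = capture()                           # block 405: capture environmental input
    for device in identify(env):              # block 410: identify target device(s)
        status = query(device)                # blocks 415-420: query console, receive status
        if status == "normal operation":      # block 425: error/warning condition?
            print(f"{device}: functioning properly")    # block 430: inform user
        else:
            user_input = repair_ui(device, status)      # blocks 435-440: launch interface, get input
            transmit(device, user_input)                # block 445: send input to console

# Example wiring with trivial stand-ins:
method_400(
    capture=lambda: "image of system 175",
    identify=lambda env: ["target-device"],
    query=lambda dev: "error state",
    repair_ui=lambda dev, status: "initiate auxiliary power supply",
    transmit=lambda dev, inp: print(f"transmitting {inp!r} for {dev}"),
)
```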
While at least one exemplary embodiment has been presented in the foregoing detailed description of the invention, it should be appreciated that a vast number of variations exist. It should also be appreciated that the exemplary embodiment or exemplary embodiments are only examples, and are not intended to limit the scope, applicability, or configuration of the invention in any way. Rather, the foregoing detailed description will provide those skilled in the art with a convenient road map for implementing an exemplary embodiment of the invention, it being understood that various changes may be made in the function and arrangement of elements described in an exemplary embodiment without departing from the scope of the invention as set forth in the appended claims and their legal equivalents.
As will be appreciated by one of ordinary skill in the art, aspects of the present invention may be embodied as a system, method, or computer program product. Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module,” or “system.” Furthermore, aspects of the present invention may take the form of a computer program product embodied in one or more computer-readable medium(s) having computer readable program code embodied thereon.
Any combination of one or more computer-readable medium(s) may be utilized. The computer-readable medium may be a computer-readable signal medium or a physical computer-readable storage medium. A physical computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, crystal, polymer, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Examples of a physical computer-readable storage medium include, but are not limited to, an electrical connection having one or more wires, a portable computer diskette, a hard disk, RAM, ROM, an EPROM, a Flash memory, an optical fiber, a CD-ROM, an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer-readable storage medium may be any tangible medium that can contain, or store a program or data for use by or in connection with an instruction execution system, apparatus, or device.
Computer code embodied on a computer-readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wired, optical fiber cable, radio frequency (RF), etc., or any suitable combination of the foregoing. Computer code for carrying out operations for aspects of the present invention may be written in any static language, such as the “C” programming language or other similar programming language. The computer code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, or communication system, including, but not limited to, a local area network (LAN) or a wide area network (WAN), Converged Network, or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
Aspects of the present invention are described above with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer-readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks. The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the above figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
While one or more embodiments of the present invention have been illustrated in detail, one of ordinary skill in the art will appreciate that modifications and adaptations to those embodiments may be made without departing from the scope of the present invention as set forth in the following claims.
Molander, Mark E., Meserth, Timothy A., Windell, David T.