An interactive control system with plural displays comprising a plurality of first interactive control apparatuses, each having an input device and an output device, and a second interactive control apparatus connected to the first interactive control apparatuses, characterized by having a registering device for registering an attribute of each first interactive control apparatus, a process executing device for executing an input process from the input device of a first interactive control apparatus to the second interactive control apparatus based on the attribute, and an output selecting device for outputting an execution result of the input process by said process executing device to the output device of the first interactive control apparatus corresponding to the attribute.
1. An interactive control system with plural displays comprising
a plurality of first interactive control apparatuses having input means and output means, a second interactive control apparatus having a display means, a see-through display apparatus being used as one of the output means of said first interactive control apparatus, through which see-through display the display of the second interactive control apparatus is visible, and a position pointing means for designating a position on a display of the display means of said second interactive control apparatus using a pointer which is displayed on the see-through display of the see-through display apparatus of said first interactive control apparatus, said position being determined by the position where said pointer is visible on the display of the display means of said second interactive control apparatus in a superimposed manner.
This is a continuation of U.S. patent application Ser. No. 09/374,263, filed Aug. 16, 1999, now U.S. Pat. No. 6,100,857, which is a continuation of U.S. patent application Ser. No. 08/969,313, filed Nov. 13, 1997, now U.S. Pat. No. 5,969,697, issued Oct. 19, 1999, which is a continuation of U.S. patent application Ser. No. 08/230,369, filed Apr. 20, 1994, now abandoned.
(1) Field of the Invention
The present invention relates to an interactive processing apparatus for interactive processing of plural displays, and a method thereof.
(2) Description of the Prior Art
Current monitoring and controlling systems have a large display installed in front of the operators in order to display overview information, such as a system configuration map of the total system and alarms indicating that something unusual is occurring, so that all the operators can grasp the condition of the system at a glance at any time. On the other hand, a display at hand prepared for each operator displays more detailed information. The amount of detailed information displayed on each display at hand is enormous, and it is not rare for it to reach hundreds of images in a large-scale system.
The operators monitor using both the large display and their displays at hand. They grasp the entire system state by watching the overview information on the large display, and when an abnormal condition is detected, they examine more detailed data on their own displays at hand and perform the necessary control operations.
However, because the information displayed on the large display and the information shown on the displays at hand are controlled independently, a conventional system requires a complex operation to relate the two. For instance, when a warning lamp blinks on the large display, the operators must retrieve the image displaying the control data for the warning from hundreds of images by selecting menus repeatedly. Therefore, there has been a problem of delayed response to an emergency such as the occurrence of an abnormal condition or an accident.
(1) Objects of the Invention
One of the objects of the present invention is to provide a man-machine interface which is capable of retrieving related detailed information simply by designating an object on the large display. For instance, a man-machine interface wherein detailed information on a warning and the control data related to it are displayed on a display at hand just by pointing to the blinking warning on the large display, and wherein the control data and setting devices related to an apparatus are displayed on a display at hand just by pointing to the apparatus in a system configuration map on the large display.
When realizing such a man-machine interface, an important point to be considered is that the large display is shared by a plurality of operators. A monitoring and controlling system is operated by the collaboration of plural operators, each in charge of a different task, such as an operator in charge of operation, an operator in charge of maintenance and inspection, and a chief on duty controlling the total operation. Accordingly, the large display is shared by operators who perform different tasks simultaneously, unlike a display at hand, which is prepared for an individual operator. Therefore, the above described interface must satisfy the following requirements.
(1) No Disturbance to Other Operators' Operation
When information necessary for only a specific operator is displayed arbitrarily on the large display, it may hide information which is being watched by other operators.
(2) Simple Retrieval of Information Necessary for Individual Tasks by Respective Operators
The necessary information differs depending on the contents of the charged task. For example, when a warning light indicating an abnormal condition of a boiler blinks, an operator in charge of operation examines control data such as the flow rate of fuel, while an operator in charge of maintenance examines the inspection record of the boiler. Accordingly, it is necessary for operators to be able to quickly retrieve the information they need without being distracted by information intended for others.
(3) An Operating Environment Suitable for Tasks Assigned to Each Operator
The frequently used commands and the permission for operation differ depending on the task charged to the respective operator. Accordingly, it is desirable that the operating environment, such as the structure of the menu and the operable range of operation, can be customized for each operator.
The object of the present invention is to provide a man-machine interface which satisfies the above requirements.
(2) Methods of Solving the Problems
In accordance with the present invention, the above described objects are realized by providing, in an interactive processing apparatus having a plurality of input means and a plurality of output means, a registering means for registering an attribute of a respective operator to an input means, a process selecting means for selecting process contents based on the attribute in response to a process request from the input means, and an executing means for executing the process selected by the process selecting means and outputting the result to an output means selected based on the attribute.
An operator registers his own attribute, for example his charged task, to the input means he operates, using the registering means. When the operator requests a process for displaying related information and menus from the input means, the process selecting means examines the operator's attribute which has been registered for the input means, and selects a process corresponding to the attribute. The executing means executes the process selected by the process selecting means, and outputs the result of the execution to an output device matched to the attribute, for example, the display at hand of the operator. By executing the process based on the operator's attribute, only the images necessary for the operator are displayed and a convenient operating environment is provided for the operator. Furthermore, the operator can execute a necessary process without disturbing other operators' operation, because the output device is selected based on the operator's attribute.
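The registration, selection, and routing just described can be pictured with a short sketch. The Python fragment below is only an illustration of the idea under assumed names (the class InteractiveProcessor, ProcessEntry, and the "mouse-12"/"display-10" identifiers are invented for the example), not the implementation of the invention itself.

```python
from dataclasses import dataclass
from typing import Callable, Dict, Tuple

@dataclass
class ProcessEntry:
    routine: Callable[[dict], str]   # process contents selected for this request/attribute
    output_for: str                  # attribute whose output means receives the result

class InteractiveProcessor:
    def __init__(self) -> None:
        self.attribute_of_input: Dict[str, str] = {}    # input means ID -> operator attribute
        self.output_of_attribute: Dict[str, str] = {}   # operator attribute -> output means ID
        self.process_table: Dict[Tuple[str, str], ProcessEntry] = {}

    def register(self, input_id: str, attribute: str, output_id: str) -> None:
        """Registering means: bind the operator's attribute to his input and output means."""
        self.attribute_of_input[input_id] = attribute
        self.output_of_attribute[attribute] = output_id

    def handle(self, input_id: str, request: str, payload: dict) -> Tuple[str, str]:
        """Process selecting + executing means: pick the process by attribute,
        execute it, and return (output means ID, result) for routing."""
        attribute = self.attribute_of_input[input_id]
        entry = self.process_table[(request, attribute)]
        return self.output_of_attribute[entry.output_for], entry.routine(payload)

# Example: the operator of "mouse-12" is registered as being in charge of operation.
proc = InteractiveProcessor()
proc.register("mouse-12", "operation", "display-10")
proc.process_table[("show detail", "operation")] = ProcessEntry(
    lambda p: f"control data for {p['object']}", "operation")
print(proc.handle("mouse-12", "show detail", {"object": "boiler"}))
# -> ('display-10', 'control data for boiler')
```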
Embodiments of the present invention are explained hereinafter with reference to the drawings.
In
A concept of the present invention is explained referring to FIG. 22. In
An attribute is registered to the workstation 250 at the start of the operation. The attribute registration processing part 257 displays a menu for attribute registration on the output apparatus 256 via the input/output processing part 252. When an item of the menu is selected through the input apparatus 255, the attribute registration processing part 257 registers the selected attribute to the attribute memory part 253, and further stores the correspondence between the selected attribute and the output apparatus 256 in the attribute memory part 203 of the workstation 200 via the network 230.
Input information from the input apparatus 255 is transmitted to the process executing part 251 via the input/output processing part 252. When the input information designates a position on the display of the output apparatus 256, the process executing part 251 executes the corresponding process and outputs the results of the execution to the output apparatus 256. On the other hand, when the input information designates a position on the display of the output apparatus 205, the process executing part 251 transmits the input information, together with the attribute read out from the attribute memory part 253, to the process executing part 201 of the workstation 200. The process executing part 201 executes the process corresponding to the transmitted input information and attribute, and transmits the results of the execution to the output apparatus corresponding to the transmitted attribute, which is read out from the attribute memory part 203.
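A rough sketch of this dispatch, under the assumption that each input event carries a flag telling which display it designates, might look as follows; the function and field names are illustrative, not taken from the patent.

```python
from dataclasses import dataclass
from typing import Callable, Tuple

@dataclass
class InputEvent:
    target: str               # "local" for output apparatus 256, "shared" for output apparatus 205
    position: Tuple[int, int] # designated position on the target display
    data: str

def dispatch_input(event: InputEvent, attribute_memory_253: dict,
                   show_on_256: Callable[[str], None],
                   send_to_workstation_200: Callable[[InputEvent, str], None]) -> None:
    """Sketch of the role of the process executing part 251 in workstation 250."""
    if event.target == "local":
        # The designated position lies on the workstation's own output apparatus 256:
        # execute the responding process here and show the result locally.
        show_on_256(f"handled {event.data} at {event.position}")
    else:
        # The designated position lies on the shared display (output apparatus 205):
        # forward the input together with the registered attribute, so that
        # workstation 200 can select the process and route the result by attribute.
        send_to_workstation_200(event, attribute_memory_253["attribute"])
```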
Next, a method for transferring the pointer 15 between the large display 1 and the display at hand 10 is explained referring to
When the large display is composed of a plurality of displays, the display position of the pointer 15 is decided in the same manner as explained referring to
Referring to
The image size (the number of pixels) of the pointer 15 may be made different on the large display 1 from that on the display at hand 10, especially when the display at hand 10 and the large display are installed far apart. Making the display size of the pointer 15 larger facilitates identification. For instance, the pointer 15 is displayed with 16×16 pixels on the display at hand 10 and with 36×36 pixels on the large display 1. With this selection, the pointer is easily recognizable even on the far-away large display.
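As a sketch only, the per-display pointer size could be kept in a small table; the display names and the fallback size here are assumptions for illustration.

```python
# Pointer image sizes quoted in the text: 16x16 pixels on the display at hand,
# 36x36 pixels on the large display.
POINTER_SIZE = {
    "display_at_hand": (16, 16),
    "large_display": (36, 36),
}

def pointer_size(display_name: str) -> tuple:
    # Fall back to the small pointer for any display not listed (an assumption).
    return POINTER_SIZE.get(display_name, (16, 16))
```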
Advantages of the above described method are as follows:
(1) a position on the display at hand 10 and the large display 1 can be designated continuously without changing the grip of the pointing device,
(2) interactive operation of the large display can be performed with the same feeling as operation of the display at hand 10,
(3) as a consequence of advantage (2), the operation is easy to learn.
At the start of operation of the system of the present embodiment, an operator registers his charged task to the system. The system provides services based on the registered charged task. The services include, for example, arranging an operation environment suitable for the charged task, facilitating retrieval of only the information necessary for the charged task, and setting a permission for operation for each charged task. Here, the operation environment means the items in a menu, the order of their arrangement, the setting of defaults, the setting of a permission for operation, and so on.
A method for registering the charged task is explained hereinafter with reference to
Operation: Selected when a service for a person in charge of operation is desirable.
Inspection: Selected when a service for a person in charge of inspection is desirable.
Chief on duty: Selected when a service for the chief on duty is desirable.
Supervisor: Selected when a service for a person responsible for all tasks is desirable. This item is selected, for example, when adjusting the system or when the system is operated by only one person.
General: Selected for a service within a range of tasks which does not cause serious disturbance to the system even if an erroneous operation is executed. This item is selected, for example, when the system is operated for on-the-job training by a person who is not familiar with it.
When a desired item in the charged task selecting menu 18 is selected with the mouse 12 (step 101), a password input region 19 is displayed. When the password assigned to the charged task is input (step 102), the charged task of the operator is registered to the system, and the registered charged task is displayed in the charged task icon 16. When the charged task must be changed, the same procedure as that of the registration (
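The registration of steps 101 and 102 can be sketched as below; the password table, the task table, and the icon-update callback are illustrative assumptions rather than the actual implementation.

```python
from typing import Callable, Dict

# Hypothetical passwords, one per charged task (step 102 checks against these).
PASSWORDS: Dict[str, str] = {
    "Operation": "op-pass", "Inspection": "insp-pass",
    "Chief on duty": "chief-pass", "Supervisor": "sv-pass", "General": "trainee",
}

def register_charged_task(selected_task: str, entered_password: str,
                          task_of_device: Dict[str, str], input_device_id: str,
                          update_task_icon: Callable[[str], None]) -> bool:
    """Register the charged task for the operator of input_device_id."""
    if PASSWORDS.get(selected_task) != entered_password:
        return False                                   # wrong password: registration refused
    task_of_device[input_device_id] = selected_task    # record the task for this operator's input device
    update_task_icon(selected_task)                    # show the task in the charged task icon 16
    return True
```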
In the system of the present embodiment, the correspondence between the charged task of an operator and the input/output devices used by that operator is controlled using the tables shown in
In
Referring to
When an operator points at an image displayed on the large display 1 with the mouse at hand, detailed information related to the pointed image and necessary for the task charged to that operator is displayed on the operator's display at hand 10, 30, or 50.
An error message for an erroneous operation on the large display 1 is also output only to the display at hand 10 or the headset 14 of the operator who performed the operation. Of course, an error message which must be referred to by other operators is output to the other operators as well.
As explained above, information related to the information displayed on the large display 1 can be referred to easily by pointing at the display with the mouse 12 at hand. The information is output only to the output device of the operator who pointed, and consequently the operation does not distract other operators. The large display is shared by many operators; therefore, if information necessary for only a specific operator were displayed on the large display 1, it might hide information being watched by other operators. Similarly, if a sound were output loudly enough to reach all operators, it might distract operators who do not need the information. Furthermore, by displaying only information selected to correspond to the charged task of the operator who pointed, the operator can easily access the information necessary for himself without being distracted by information intended for other operators.
The large display 1 is used in common by a plurality of operators who are in charge of different tasks. Because the suitable operation environment differs for each task, the plant monitoring and controlling system 91 provides an operation environment corresponding to the charged task of each operator who uses the large display 1 for interactive operation.
In the above description, the case in which information is output to an operator in response to the operator's own operation is explained. However, there are also cases wherein information, such as a warning, is output to an operator by the system spontaneously. Even in this case, the information is output only to the operator in charge of the task which needs the information. For example, a warning sound for a warning which can be handled only by the operator in charge of operation is output only to the headset 14 of the operator in charge of operation.
In a case when the large display 1 is too large to fit within the operator's visual field, there is a possibility that the operator fails to notice information which is displayed outside his visual field. To prevent this, a sound is output simultaneously with the display on the large display 1 so as to indicate the display position. The operator becomes aware that new information is displayed by the sound, without watching the large display 1. Further, because the sound is output so as to indicate the display position, the operator can be aware of the approximate position of the displayed information. When the sound is output simultaneously with the display, the sound is output only to the operator in charge of the task which requires the displayed information. For instance, when information relating to operation is displayed, the sound is output only to the headset 14 of the operator in charge of operation.
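One way to picture this behavior is the sketch below; the left/right panning model and the headset lookup are assumptions of the sketch, not details given in the text.

```python
from typing import Callable, Dict, Optional

def announce_display(x: int, large_display_width: int, charged_task: str,
                     headset_of_task: Dict[str, str],
                     play_sound: Callable[[str, float], None]) -> None:
    """Play a cue only on the headset of the operator in charge of `charged_task`,
    positioned according to where the new information appeared on the large display."""
    pan = x / large_display_width          # 0.0 = left edge, 1.0 = right edge
    headset: Optional[str] = headset_of_task.get(charged_task)
    if headset is not None:
        play_sound(headset, pan)           # only the concerned operator hears the cue
```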
Referring to
The icon 24 can be placed at an arbitrary position on the display at hand 10 and the large display 1 by dragging. The icon 24 is dragged by moving the pointer 15 onto the icon 24 and then moving the mouse 12 while pressing, for example, the left button of the mouse 12.
Referring to
A program for realizing the system 19 can be composed so as to be executed in any one of the workstations 2, 11, 31, 51, or in any several or all of the workstations 2, 11, 31, 51.
In the corresponding table 130 of events/executing processes, which is controlled for every display object shown in
(1) Button push down: Kinds of events generated by pushing down a mouse button.
(2) Button release: Kinds of events generated by releasing a mouse button.
(3) Key push down: Kinds of events generated by pushing down a key in the keyboard.
(4) Key release: Kinds of events generated by releasing a key in the keyboard.
The button number designates the button or the key which generated the event. The person in charge designates the charged task of the operator who generated the event.
The executing process is divided into two items: routine and output. The routine stores the process to be executed when the event is generated, and the output designates the output apparatus used by the operator who must receive the output. The operator is designated by the kind of task charged to him; that is, when an operator in charge of operation is designated as the destination of an output, the output is transferred to the output apparatus used by the operator in charge of operation.
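As a data-structure sketch, each displayed object might carry a list of (input event item, executing process item) pairs; the entries shown here are invented examples, not the contents of the actual table 130.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class EventItem:
    kind: str               # "button push down", "button release", "key push down", "key release"
    button_number: int      # button or key which generated the event
    person_in_charge: str   # charged task of the operator who generated the event

@dataclass
class ExecutingProcessItem:
    routine: Callable[[], str]  # process executed when the event is generated
    output: str                 # charged task whose output apparatus receives the result

# Illustrative entries for one displayed object (e.g. a boiler symbol):
table_130: List[Tuple[EventItem, ExecutingProcessItem]] = [
    (EventItem("button push down", 1, "Operation"),
     ExecutingProcessItem(lambda: "control data of the boiler", "Operation")),
    (EventItem("button push down", 1, "Inspection"),
     ExecutingProcessItem(lambda: "inspection record of the boiler", "Inspection")),
]
```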
Referring to
In the input event step 140, the input event queues of the workstations 11, 31, 51 are examined. If an event is stored in an input event queue, the event is taken out. The event includes information such as the ID of the input device which generated the event, the number of the button which generated the event, and the location where the event was generated. In step 141, the table 120 (FIG. 9) is searched using the input device ID of the taken-out event as a key to retrieve the charged task of the operator who generated the event. In step 142, a displayed object at the location where the event was generated is searched for based on the event generation location.
If no displayed object exists at the event generation location (step 143), the operation returns to step 140 and continues to process the next input event. If a displayed object exists at the event generation location (step 143), the operation goes to step 144. In step 144, the input event items in the corresponding table 130 of the event/executing process for the displayed object found in step 142 are examined to determine whether any of them match the kind of operation, button number, and person in charge of the input event. If an event item matching the input event is found (step 145), the output destination of the corresponding executing process item is taken out, and the table 121 (
When no event item matching the input event is found in step 144 (step 145), the operation returns to step 142, and another object at the event generation location is searched for. The above described processing is repeated until the plant monitoring and controlling system 91 is terminated (step 148).
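The flow of steps 140 through 148 can be condensed into the loop below; the tables are reduced to plain dictionaries, and the helper names (pop_event, objects_at, the .show method) are assumptions of the sketch.

```python
def event_loop(pop_event, table_120, table_121, objects_at, system_running):
    """table_120: input device ID -> charged task of its operator.
    table_121: charged task -> output apparatus object with a .show(text) method.
    objects_at(pos): displayed objects at the event position, each carrying a
    table_130 list of (event item, executing process item) pairs as sketched above."""
    while system_running():                                      # repeat until the system ends (step 148)
        event = pop_event()                                      # step 140: take an event from a queue
        if event is None:
            continue
        charged_task = table_120[event.device_id]                # step 141: task of the operator
        for obj in objects_at(event.position):                   # steps 142-143: objects at the location
            matched = False
            for item, proc in obj.table_130:                     # step 144: scan the object's table 130
                if (item.kind == event.kind and
                        item.button_number == event.button and
                        item.person_in_charge == charged_task):  # step 145: a matching item is found
                    table_121[proc.output].show(proc.routine())  # execute and send the result to the
                    matched = True                               # output apparatus of the designated task
                    break
            if matched:
                break                                            # otherwise try the next object (back to 142)
```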
Referring to
The explanation is performed taking the display at hand 10 as an example, but cases of the other displays at hand 30, 50 are entirely the same. In
The coordinate values renewed in the workstation 11 in response to input from the mouse 12 are expressed as (curx, cury). When the mouse 12 is moved, the amount of movement (dx, dy) is reported to the workstation 11, and (curx, cury) is renewed by the following equation:
curx = curx + dx, cury = cury + dy  (1)
where (curx, cury) is restricted to the region
0 ≤ curx ≤ Xmax, 0 ≤ cury ≤ Ymax  (2)
in which Xmax and Ymax denote the maximum coordinate values of the display region.
That means, if the result of the renewal exceeds the region defined by equation (2), (curx, cury) is set to a value at the boundary. For example, if -2 is obtained for cury by executing equation (1), cury is set to 0. The origin of the coordinates on the large display 1 and the display at hand 10 is assumed to be located at the top-left.
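A sketch of the renewal and clamping follows; the bound values are illustrative, not values given in the text.

```python
XMAX, YMAX = 1279, 1023   # illustrative maximum coordinates; the origin is at the top-left

def renew_pointer(curx: int, cury: int, dx: int, dy: int) -> tuple:
    """Apply equation (1) and clamp the result to the region of equation (2)."""
    curx, cury = curx + dx, cury + dy          # equation (1)
    curx = min(max(curx, 0), XMAX)             # clamp to the boundary of equation (2)
    cury = min(max(cury, 0), YMAX)             # e.g. cury = -2 is set back to 0
    return curx, cury
```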
Referring to
In the above embodiment, the charged task of the operator is registered first, and by controlling the corresponding relationship between the registered charged task and the input/output devices, information corresponding to the charged task is displayed and the operation environment is set. However, any attribute of the operator other than the charged task is also usable. For instance, the name, age, order, class, rank, sex, mother language, or skillfulness can be used for the registration and the control. Further, not only a single attribute but also several attributes connected by logical expressions can be used for the registration. With this variation, the service can be provided with contents matched to various attributes of the operator.
Further, in the above embodiment, a method of registering the attribute of the operator by selecting from a menu is used. However, the attribute of the operator may be recognized by the plant monitoring and controlling system 91 itself. For instance, the operator sitting in front of the display 10 may be recognized by his face, or by his voiceprint input from a microphone.
Furthermore, in the above embodiment, the attribute of the operator is registered at the beginning of the operation. However, the operator may be asked for the attribute (a menu for selecting the attribute is displayed), or a process for recognizing the attribute may be started, at the moment when the system needs to know the attribute of the operator.
In the above embodiment, a mouse is used for pointing on the large display 1, but a laser beam pointer can also be used. The pointing position on the large display 1 is determined by capturing video with a video camera placed in front of or behind the large display screen and processing the video image to determine the position of the laser beam. When a plurality of laser pointers are used, the device ID is recognized by giving each laser pointer a laser beam of a different color and determining the color of the laser beam. Similarly, infrared pointers can be used; in this case, the devices can be distinguished by using mutually different infrared frequencies.
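A simplified sketch of this color-based recognition, using plain NumPy, is shown below; the color table, the tolerance, and the camera-to-display mapping are assumptions of the sketch.

```python
import numpy as np

LASER_COLORS = {"pointer-A": (255, 40, 40), "pointer-B": (40, 255, 40)}  # one color per device

def locate_laser(frame: np.ndarray, tolerance: int = 60):
    """frame: H x W x 3 RGB image of the large display screen.
    Returns (device_id, (x, y)) for the first matching laser spot, or None."""
    for device_id, color in LASER_COLORS.items():
        diff = np.abs(frame.astype(int) - np.array(color, dtype=int)).sum(axis=2)
        mask = diff < tolerance                                  # pixels close to this pointer's color
        if mask.any():
            ys, xs = np.nonzero(mask)
            return device_id, (int(xs.mean()), int(ys.mean()))   # centroid of the detected spot
    return None
```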
In the above embodiment, the pointer 15 moves between the display at hand 10 and the large display 1 as if the upper side of the display at hand 10 and a whole or a part of the lower side of the large display 1 were connected. However, as shown in
In the above embodiment, a conventional display apparatus is used as the display at hand 10. However, a see-through display apparatus can also be used as the display at hand 10. A see-through display is a translucent display, and the information displayed on it is visible superimposed on the background behind the display. As one example of such a see-through display apparatus, there is the see-through head-mounted display described in a reference (November 1991, ACM Press, pp. 9-17).
Referring to
Naturally, the relationship between the display coordinates of the see-through displays 1100, 1200 and those of the large display 1000 is kept constant. That is, in a case such as mounting a see-through display on the head of the operator, a 3D tracking system is used for tracking the position and orientation of the see-through display, and the display coordinates of the see-through display are corrected according to its position relative to the large display 1000.
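A minimal sketch of such a correction is given below, assuming a pinhole model in which the large display 1000 lies in the plane z = 0 and the tracked pose is given as an eye position and a world-to-eye rotation; all parameter values are illustrative.

```python
import numpy as np

def project_to_see_through(point_on_large, eye_pos, world_to_eye,
                           focal_m=0.05, pixels_per_m=5000.0):
    """point_on_large: (x, y) in metres on the large display plane z = 0.
    eye_pos: 3-vector of the tracked see-through display position.
    world_to_eye: 3x3 rotation matrix from world to eye coordinates.
    Returns (u, v) see-through display coordinates in pixels."""
    p_world = np.array([point_on_large[0], point_on_large[1], 0.0])
    p_eye = world_to_eye @ (p_world - np.asarray(eye_pos, dtype=float))
    u = focal_m * p_eye[0] / p_eye[2] * pixels_per_m   # perspective projection onto the
    v = focal_m * p_eye[1] / p_eye[2] * pixels_per_m   # virtual image plane of the display
    return u, v
```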
Furthermore, a see-through display and a conventional display can be used together as displays at hand. That is, information which should be seen superimposed on the information displayed on the large display 1000, such as a pointer for designating a position on the large display 1000 and a menu for operating a displayed object on the large display 1000, is displayed on the see-through display, while other information which does not need to be seen superimposed on the displayed object on the large display 1000 may be displayed on the conventional display.
Advantages of using the see-through display for a display at hand are as follows;
(1) Interference between operators can be completely eliminated. Displaying a pointer or a menu directly on the large display distracts other operators, but displaying the pointer or the menu on an operator's own display at hand does not interfere with other operators' work, because the display on his own see-through display is not visible to the other operators. For instance, when many pointers are displayed on the large display simultaneously, it becomes difficult for an operator to identify his own pointer among them, which is a problem to be solved. However, if the pointer for each operator is displayed only on his own see-through display, this problem is eliminated, because each operator sees only his own pointer at any time.
(2) Information displayed on the display at hand and information displayed on the large display can be integrated visually. When a conventional display is used as the display at hand, it is necessary to move the line of sight in order to refer to both the information displayed on the large display and the information displayed on the display at hand, and it is difficult to see both simultaneously. On the contrary, information displayed on the see-through display is visible superimposed on the information displayed on the large display, and both can be referred to simultaneously. Furthermore, related information can be displayed next to each other.
In accordance with the present invention, the following advantages are achieved:
(1) A process matching the operator who has operated can be executed. An attribute of the operator is registered corresponding to the input means used by the operator, and when the operator operates the input means, a process matching that operator can be executed by examining the registered attribute of the operator and selecting for execution the process matched to that attribute.
(2) An output destination can be selected corresponding to the operator who has operated, so as not to distract other operators from their tasks. An attribute of the operator is registered corresponding to the input means used by the operator, and when the operator operates the input means, the result can be output without distracting other operators by examining the registered attribute of the operator and selecting an output destination for the result of the processing matched to that attribute.
(3) An operating environment matching the operator can be set. An attribute of the operator is registered corresponding to the input means used by the operator, and when the operator operates the input means, an operating environment matching that operator can be provided by examining the registered attribute of the operator and setting the operating environment matched to that attribute.
Tani, Masayuki, Yamaashi, Kimiya, Tanikoshi, Koichiro, Futakawa, Masayasu, Horita, Masato, Uchigasaki, Harumi, Nishikawa, Atsuhiko, Hirota, Atsuhiko