A method of setting up an operating room including placing at least one surgical device on at least one surface in the operating room, capturing an image of the at least one surgical device with a camera, comparing actual attributes of the at least one surgical device determined using the image captured by the camera with desired attributes of the at least one surgical device stored in a digital preference storage using a computer system, and issuing instruction information of the at least one surgical device in the operating room, the instruction information being dependent on results of the step of comparing.
1. A method comprising, at a computing system:
receiving from at least one camera at least one image of personnel in an operating room during a medical procedure on a patient, the personnel not including the patient;
automatically recognizing the personnel in the operating room from the at least one image;
automatically determining whether essential personnel are present in the operating room during the medical procedure based on recognition of the personnel in the at least one image and information associated with personnel assigned to the medical procedure on the patient; and
in response to determining that at least one essential personnel is missing, generating at least one notification that the at least one essential personnel is missing.
2. The method of claim 1, further comprising:
recording the identity of the personnel in the operating room during the medical procedure in an electronic medical record of the patient.
3. The method of claim 1, further comprising:
automatically determining a number of times the personnel enters and exits the operating room from the at least one image.
4. The method of claim 1, further comprising:
automatically determining a number of times the personnel violates a sterile field from the at least one image by comparing positions of the personnel with a predefined region within the operating room associated with the sterile field.
5. The method of claim 1, further comprising:
determining active and idle time of the personnel from the at least one image.
6. The method of claim 1, further comprising:
determining hand placement of the personnel during the medical procedure from the at least one image.
7. A method of recording operating room procedures comprising, at a computing system:
receiving from at least one camera at least one image of personnel in an operating room during a medical procedure on a patient, the personnel not including the patient;
automatically recognizing non-scrubbed personnel in the operating room from the at least one image;
automatically determining a number of times the non-scrubbed personnel violates a sterile field by comparing positions of the non-scrubbed personnel with a predefined region within the operating room associated with the sterile field; and
recording the status of the personnel in a storage.
8. The method of
9. A method of recording operating room procedures comprising, at a computing system:
receiving from at least one camera at least one image of personnel in an operating room during a medical procedure on a patient;
automatically identifying at least a portion of the personnel in the operating room from the at least one image; and
based on identifying at least a portion of the personnel in the operating room from the at least one image, recording an identity of at least one identified personnel in the operating room during the medical procedure in an electronic medical record of the patient.
10. The method of
This application is a divisional of U.S. patent application Ser. No. 15/190,636, filed Jun. 23, 2016, which claims the benefit of U.S. Provisional Application No. 62/183,995, filed Jun. 24, 2015, the entire contents of each of which are incorporated herein by reference.
The present invention relates to a method of setting up a medical care area, such as an operating room, and in particular to a method of arranging medical or surgical devices in an operating room.
Current methods for setting up medical care areas, such as an operative theater, include arranging the medical care area or operative theater according to the instructions on a surgical preference card. For each procedure that a surgeon performs, a separate preference card is maintained. The surgical preference cards outline a variety of items, including surgical equipment preference and layout, patient positioning, and surgical video equipment setup. At a large hospital, where there are many surgeons and many procedures to be performed, thousands of surgical preference cards must be arranged, tracked and utilized.
Surgical preference cards have become extremely important as hospitals push toward more efficient workflows and strive to complete more surgeries in any given day. The surgical preference cards help the surgical staff avoid time-consuming (and costly) situations wherein the equipment is improperly arranged in the operative theater prior to surgery and/or essential equipment is missing. Many products have recently been employed to automate the management, creation and use of surgical preference cards, including by digitizing them.
A fast, easy and reliable method of arranging the medical or surgical devices in a medical care area is desired.
The present invention, according to one aspect, is directed to a method of setting up an operating room including placing at least one surgical device on at least one surface in the operating room, capturing an image of the at least one surgical device with a camera, comparing actual attributes of the at least one surgical device determined using the image captured by the camera with desired attributes of the at least one surgical device stored in a digital preference storage using a computer system, and issuing instruction information of the at least one surgical device in the operating room, the instruction information being dependent on results of the step of comparing.
Yet another aspect of the present invention is to provide a method of arranging a medical care area. The method includes placing at least one medical or surgical device in the medical care area, capturing an image of the at least one medical or surgical device with a camera, with the image including at least one actual attribute of the at least one medical or surgical device, storing at least one desired attribute of the at least one medical or surgical device in a digital preference storage using a computer system, comparing the at least one actual attribute of the at least one medical or surgical device using the image captured by the camera with the at least one desired attribute stored in the digital preference storage, and issuing instruction information in the medical care area to personnel responsible for arranging the medical care area, the instruction information including at least one of: the number present, the style, the location and the orientation of the at least one medical or surgical device located in the medical care area.
One or more embodiments of the present invention are illustrated by way of example and should not be construed as being limited to the specific embodiments depicted in the accompanying drawings, in which like reference numerals indicate similar elements.
For purposes of description herein, it is to be understood that the invention may assume various alternative orientations, except where expressly specified to the contrary. It is also to be understood that the specific devices and processes illustrated in the attached drawings, and described in the following specification are simply exemplary embodiments of the inventive concepts defined in the appended claims. Hence, specific dimensions and other physical characteristics relating to the embodiments disclosed herein are not to be considered as limiting, unless the claims expressly state otherwise.
An aspect of the present invention is to ensure that the proper surgical instruments 12 are located on the table 14 and in the proper position on the table 14 according to preferences of particular medical personnel (e.g., a surgeon) or according to a particular procedure being performed.
After the computer system 48 receives the image of the surgical instruments 12 captured by the camera 36 at step 45, the computer system 48 obtains actual attributes of the surgical instruments 12 at step 47. The actual attributes of the surgical instruments 12 can include the number of each of the particular surgical instruments 12, the style of the surgical instruments 12, the brand of the surgical instruments 12, the location/orientation of the surgical instruments 12 on the table 14 and/or the presence of the surgical instruments 12. It is contemplated that other actual attributes of the surgical instruments 12 could be found. The actual attributes of the surgical instruments 12 can be found using an image recognition algorithm (e.g., a Haar cascade classifier). Such image recognition algorithms are well known to those skilled in the art. It is also contemplated that the surgical instruments 12 could include a linear or matrix bar code thereon for determining the actual attributes of the surgical instruments 12. It is further contemplated that the surgical instruments 12 could include indicators thereon for assisting in determining the actual attributes of the surgical instruments 12. For example, two surgical instruments 12 may have the same outside configuration, but have different internal parts or components. In such a situation, the different surgical instruments 12 could each include a different exterior visual indicator (e.g., a modulated infrared or other spectrum beacon, different colors, or a different linear or matrix bar code thereon) to allow the computer system 48 to properly identify the surgical instrument 12.
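By way of a non-limiting sketch, the image recognition step could be implemented with OpenCV's Haar cascade API. The cascade file instruments.xml (a classifier that would have to be trained offline on instrument images) and the file names are assumptions for illustration only:

```python
import cv2

# Assumed cascade trained offline on instrument silhouettes; OpenCV ships
# only face/eye/body cascades, so this file is a placeholder.
cascade = cv2.CascadeClassifier("instruments.xml")

def detect_instruments(image_path):
    """Return bounding boxes of candidate instruments in a tray image."""
    image = cv2.imread(image_path)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    # detectMultiScale scans the image at several scales and returns an
    # (x, y, w, h) rectangle for each detection.
    boxes = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return [(int(x), int(y), int(w), int(h)) for (x, y, w, h) in boxes]

if __name__ == "__main__":
    for box in detect_instruments("table_14.jpg"):
        print("instrument candidate at", box)
```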
For the example of the surgical instruments 12 on the table 14 illustrated in
After the computer system 48 obtains actual attributes of the surgical instruments 12 at step 47, the computer system 48 compares the actual attributes of the surgical instruments 12 with desired attributes of the surgical instruments 12 at step 70. The desired attributes of the surgical instruments 12 are stored in a digital preference storage. The digital preference storage can be saved in the computer system 48 or retrievable by the computer system 48. It is contemplated that the digital preference storage can be accessible to the computer system 48 over the Internet or another type of wide area network (WAN), a local area network (LAN), a corporate intranet, any other type of network, or a combination of such networks, or can be stored in cloud storage retrievable through a network interface of the computer system 48. For example, the digital preference storage can be located in existing hospital IT systems (e.g., a hospital's electronic medical record (EMR)).
In the method 40 of properly locating the surgical instruments 12, if the actual attributes of the surgical instruments 12 are identical to the desired attributes as determined at decision step 72, no further action is taken or the computer system 48 issues instruction information indicating that no further action is needed at step 74. The computer system 48 can issue an indication that no further action is needed using any visual and/or audio notification. For example, the computer system 48 can issue an “OK” message on an associated or attached display or monitor 49, can flash a green light, can issue audio stating that all of the surgical instruments 12 are proper and in the correct location, or any combination of the above.
However, if the actual attributes of the surgical instruments 12 are not identical to the desired attributes as determined at decision step 72, the computer system 48 issues instruction information at step 76. The instruction information will provide instructions for correcting the actual attributes of the surgical instruments 12 to be identical to the desired attributes. The instruction information can include displaying the instruction information on a display or monitor 49 of the computer system 48 and/or providing audible directions over a speaker (not shown). The instruction information can include instructions for removing at least one of the surgical instruments 12, adding at least one surgical instrument 12 to the table 14 and/or moving locations of at least one of the surgical instruments 12. For example, the instruction information can include instructions to add another scalpel 26a and another scissors 30a as illustrated in
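As a minimal sketch of how the comparison at step 70 and the instruction information at steps 74/76 could be reduced to code, assuming the digital preference storage holds per-type instrument counts (the type names below are illustrative):

```python
from collections import Counter

def compare_attributes(actual, desired):
    """Compare per-type instrument counts and emit corrective instructions.

    `actual` and `desired` map an instrument type (e.g. "scalpel") to the
    number seen on the table / required by the preference card.
    """
    instructions = []
    for item, want in desired.items():
        have = actual.get(item, 0)
        if have < want:
            instructions.append(f"add {want - have} x {item}")
        elif have > want:
            instructions.append(f"remove {have - want} x {item}")
    for item in actual.keys() - desired.keys():
        instructions.append(f"remove all {item} (not on preference card)")
    return instructions or ["OK - no further action needed"]

# Example mismatch: one more scalpel and one more scissors are required.
actual = Counter({"scalpel": 1, "scissors": 1, "forceps": 2})
desired = {"scalpel": 2, "scissors": 2, "forceps": 2}
for line in compare_attributes(actual, desired):
    print(line)
```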
In the illustrated example, after receiving the instruction information at step 76, the hospital personnel can then conform the actual attributes of the surgical instruments 12 to be identical to the desired attributes at step 78. It is contemplated that the method 40 of properly locating the surgical instruments 12 can return to step 44 after step 78 to ensure that the surgical instruments 12 are properly located.
A further aspect of the present invention is to provide the proper surgical equipment 16 in the proper location within the operating room 10.
One example of the surgical equipment 16 is the image and video capture and recording device 50 located in a control housing 121. The image and video capture and recording device 50 can output images and video on the touchscreen monitor 49, which can be integrated into the control housing 121. The image and video capture and recording device 50 can also output images and video to the additional monitor 135 via either a wired connection or wirelessly. The illustrated image and video capture and recording device 50 is therefore capable of displaying, on the touchscreen monitor 49 and/or the additional monitor 135, images and videos captured live by cameras and/or replayed from recorded images and videos.
The illustrated image and video capture and recording device 50 is also capable of recording images and videos. The image and video capture and recording device 50 can include an internal hard drive for storing captured images and videos and can also communicate with a picture archiving and communication system (PACS), as is well known to those skilled in the art, to save images and videos in the PACS and to retrieve images and videos from the PACS. The image and video capture and recording device 50 can also display any saved images (e.g., from the internal hard drive or from the PACS) on the touchscreen monitor 49 and/or the additional monitor 135. It is contemplated that the image and video capture and recording device 50 could obtain or create images of a patient during a surgical procedure from a variety of sources (e.g., from video cameras, video cassette recorders, X-ray scanners (which convert X-ray films to digital files), digital X-ray acquisition apparatus, fluoroscopes, CT scanners, MRI scanners, ultrasound scanners, CCD devices, and other types of scanners (handheld or otherwise)).
Yet another example of the surgical equipment 16 is the camera control unit 124 that is coupled to the video camera 120 by a flexible electronic transmission line 140. The transmission line 140 conveys video data from the video camera 120 to the camera control unit 124 and also conveys various control signals bi-directionally between the video camera 120 and the camera control unit 124. The camera control unit 124 can be connected (wired or wirelessly) to the image and video capture and recording device 50 to provide the images and videos to the image and video capture and recording device 50. Video cameras 120 and camera control units 124 used with scopes 138 are well known to those skilled in the art. An example of the video camera 120 and camera control unit 124 for use with an endoscope is the 1488 HD Camera as sold by Stryker Corporation of Kalamazoo, Mich.
Another example of the surgical equipment 16 is the light source unit 126 that transmits high intensity light into the patient through the scope 138 via a fiber optic cable 144. Light source units 126 used with scopes 138 are well known to those skilled in the art. An example of the light source unit 126 for use with the endoscope 138 is the L9000 LED Light Source as sold by Stryker Corporation of Kalamazoo, Mich.
Yet another example of the surgical equipment 16 is the printer 130. The printer 130 can be connected to the image and video capture and recording device 50 for outputting images from the image and video capture and recording device 50. An example of the printer 130 is the SDP1000 Medical Grade Digital Printer as sold by Stryker Corporation of Kalamazoo, Mich.
Another example of the surgical equipment 16 is the fluid management pump 132. The fluid management pump 132 is employed during surgical procedures to introduce sterile solution into surgical sites and to remove fluid and debris generated by the procedure. In the illustrated example, the fluid management pump 132 can supply the motive force for pumping the sterile solution through an inflow tube (not shown) into the surgical site via a cannula. The fluid management pump 132 can also supply the motive force for suctioning solution and any waste material removed from the surgical site from an outflow tube 147 to a waste tube 137 connected to the waste container cart 19e. In the illustrated example, the outflow tube 147 is connected to the shaver 136. An example of the fluid management pump is disclosed in U.S. Patent Application Publication No. 2013/0267779 entitled CONTROL FOR SURGICAL FLUID MANAGEMENT PUMP SYSTEM, the entire contents of which are hereby incorporated herein by reference. An example of the shaver 136 is the FORMULA® Shaver Hand Piece as sold by Stryker Corporation of Kalamazoo, Mich.
Yet another example of the surgical equipment 16 is the RF and shaver control 134. The RF and shaver control 134 sends power to an ablation and coagulation device or electrosurgical tool (not shown) and/or the shaver 136. Ablation and coagulation devices are well known to those skilled in the art. An example of an ablation and coagulation device that can be connected to the RF and shaver control 134 is the SERFAS™ Energy Probe as sold by Stryker Corporation of Kalamazoo, Mich. The RF and shaver control 134 sends power to the shaver 136 through a cable 143. An example of the RF and shaver control 134 is the CROSSFIRE® arthroscopic resection system as sold by Stryker Corporation of Kalamazoo, Mich.
Another example of the surgical equipment 16 is the insufflator 141. The insufflator 141 is used to supply inert, nontoxic gases, such as carbon dioxide, into a body cavity, in order to expand the cavity or to minimize visual obstruction during minimally invasive or laparoscopic surgery. Insufflators are well known to those skilled in the art. An example of the insufflator 141 is the PNEUMOSURE® 45 L Insufflator as sold by Stryker Corporation of Kalamazoo, Mich. Further examples of surgical equipment 16 include stand-alone pieces of surgical equipment 16 such as a portable monitor 135a and a portable overhead light 56a.
An aspect of the present invention is to ensure that the proper surgical equipment 16 is located in the operating room 10 and in the proper location in the operating room 10 according to preferences of particular medical personnel (e.g., a surgeon) or according to a particular procedure being performed.
After the computer system 48 receives the image of the surgical equipment 16 captured by the camera 36 at step 104, the computer system 48 obtains actual attributes of the surgical equipment 16 at step 108. The actual attributes of the surgical equipment 16 can include the number of each piece of surgical equipment 16, the style of the surgical equipment 16, the brand of the surgical equipment 16, the location/orientation of the surgical equipment 16 on the floor 18 or on a cart 19a, etc. and/or the presence of the surgical equipment 16 and/or a cart 19a, etc. with the surgical equipment 16 thereon. It is contemplated that other actual attributes of the surgical equipment 16 could be found. The actual attributes of the surgical equipment 16 can be found using an image recognition algorithm (e.g., a Haar cascade classifier). Such image recognition algorithms are well known to those skilled in the art. It is also contemplated that the surgical equipment 16 could include a linear or matrix bar code thereon for determining the actual attributes of the surgical equipment 16. It is further contemplated that the surgical equipment 16 could include indicators thereon for assisting in determining the actual attributes of the surgical equipment 16. For example, two pieces of surgical equipment 16 may have the same outside configuration, but have different internal parts or components. In such a situation, the different pieces of surgical equipment 16 could each include a different exterior visual indicator (e.g., a modulated infrared or other spectrum beacon, different colors, or a different linear or matrix bar code thereon) to allow the computer system 48 to properly identify the surgical equipment 16.
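For the matrix bar code variant, a sketch using OpenCV's built-in QR code detector; the payload format and the lookup table mapping payloads to equipment records are assumptions for illustration:

```python
import cv2

# Assumed lookup from code payload to an equipment record.
EQUIPMENT_DB = {
    "STRYKER-L9000": {"type": "light source", "style": "LED"},
    "STRYKER-1488": {"type": "camera control unit", "style": "HD"},
}

def identify_equipment(image_path):
    """Decode a matrix (QR) code on a device and look up its attributes."""
    detector = cv2.QRCodeDetector()
    payload, points, _ = detector.detectAndDecode(cv2.imread(image_path))
    if not payload:
        return None  # no code found; fall back to image recognition
    return EQUIPMENT_DB.get(payload, {"type": "unknown", "payload": payload})

print(identify_equipment("cart_19a.jpg"))
```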
For the example of the surgical equipment 16 on the floor 18 of the operating room 10 illustrated in
After the computer system 48 obtains actual attributes of the surgical equipment 16 at step 108, the computer system 48 compares the actual attributes of the surgical equipment 16 with desired attributes of the surgical equipment 16 at step 110. The desired attributes of the surgical equipment 16 are stored in a digital preference storage. The digital preference storage can be saved in the computer system 48 or retrievable by the computer system 48 as outlined above.
In the method 100 of properly locating the surgical equipment 16, if the actual attributes of the surgical equipment 16 are identical to the desired attributes as determined at decision step 112, no further action is taken or the computer system 48 issues instruction information indicating that no further action is needed at step 114. The computer system 48 can issue an indication that no further action is needed using any visual and/or audio notification. For example, the computer system 48 can issue an “OK” message on the display or monitor 49, can flash a green light, can issue audio stating that all of the surgical equipment 16 is proper and in the correct location, or any combination of the above.
However, if the actual attributes of the surgical equipment 16 are not identical to the desired attributes as determined at decision step 112, the computer system 48 issues instruction information at step 116. The instruction information will provide instructions for correcting the actual attributes of the surgical equipment 16 to be identical to the desired attributes. The instruction information can include displaying the instruction information on a display or monitor 49 of the computer system 48 and/or providing audible directions over a speaker (not shown). The instruction information could also include instructions for locations of pieces of surgical equipment 16 that are not in the room 10 but should be in the room 10. The instruction information can include instructions for removing at least one of the pieces of surgical equipment 16 (including carts 19a, etc.), adding at least one piece of surgical equipment 16 (including carts 19a, etc.) to the room 10 and/or moving locations of at least one of the pieces of surgical equipment 16 (including carts 19a, etc.).
For example, the instruction information can include instructions to add surgical equipment, remove surgical equipment or rearrange the surgical equipment 16 in the operating room 10.
In the illustrated example, after receiving the instruction information, the hospital personnel can then conform the actual attributes of the surgical equipment 16 to be identical to the desired attributes at step 118. It is contemplated that the method 100 of properly locating the surgical equipment 16 can return to step 104 after step 118 to ensure that the surgical equipment 16 is properly located.
It is contemplated that the computer system 48 can be programmed to observe the layout of the surgical devices in the operating room 10 and record the actual attributes of the surgical devices to form the desired attributes of the surgical devices to be stored in the digital preference storage. It is further contemplated that the computer system can obtain desired configurations for the surgical equipment from the digital preference storage associated with a particular person to be using the operating room (e.g., surgeon) and/or with a particular procedure to be performed and configure the surgical equipment according to the desired configurations. For example, the procedure for configuring surgical equipment as set forth in U.S. Patent Application No. 62/100,286 entitled METHOD OF CONFIGURING DEVICES IN AN OPERATING THEATER, the entire contents of which are hereby incorporated by reference, can be used.
Another aspect of the present invention is to obtain images of numerous people/personnel 400 and objects in a medical facility and to save and analyze the images to improve efficiency of the medical facility. In this aspect of the present invention, sensors and/or cameras 320 are located throughout the medical facility to track potentially everything moving within the medical facility.
The illustrated camera and/or sensors 320 can potentially track everything moving through the viewing area of the camera and/or sensors 320. The camera and/or sensors 320 may be active or passive and can capture images or sense personnel, movement and medical devices (and other objects). The camera and/or sensors 320 can have a wide-angle lens and processing software that tracks personnel, movement, medical devices (and other objects) and patterns. The camera and/or sensors 320 can also have the capability to capture depth information using an active scanning method (e.g., a 3D scanner as is well known in the art). The camera and/or sensors 320 can capture images in color, black and white, or in the infrared. Examples of the camera and/or sensors 320 can include the room camera 36b fixed to walls 52 or the ceiling 54 of the room 10 as outlined above and the camera 36c in the overhead light 56. It is contemplated that the camera and/or sensors 320 can include a combination of motion sensor and camera wherein the camera is activated when motion is sensed by the motion sensor. It is further contemplated that the camera and/or sensors 320 can be composed of sensors that can sense passage of personnel and medical devices without capturing an optical image thereof (e.g., by reading RFID chips on the personnel and medical devices).
In the illustrated example, the captured images and/or sensed personnel and medical devices (and other items) are processed to determine the personnel and medical devices (and other items) passing through an area in front of the camera and/or sensors 320. It is contemplated that the camera and/or sensors 320 can have an on-board computer system to analyze the personnel and medical devices (and other items) to determine the characteristics thereof. For example, the camera and/or sensors 320 can have a computer system that includes one or more processors or other similar control devices as well as one or more memory devices. The processor controls the overall operation of the computer system and can include hardwired circuitry, programmable circuitry that executes software, or a combination thereof. The processor may, for example, execute software stored in the memory device. The processor may include, for example, one or more general- or special-purpose programmable microprocessors and/or microcontrollers, application specific integrated circuits (ASICs), programmable logic devices (PLDs), programmable gate arrays (PGAs), or the like. The memory device may include any combination of one or more random access memories (RAMs), read-only memories (ROMs) (which may be programmable), flash memory, and/or other similar storage devices. It is contemplated that the computer system for the camera and/or sensors 320 can run an image recognition algorithm (e.g., a Haar cascade classifier) to analyze the personnel and medical devices (and other items) to determine the characteristics thereof. It is further contemplated that the personnel and medical devices (and other items) could include indicators thereon (e.g., different exterior visual indicators as outlined above) for assisting in determining the characteristics thereof. For determining the identity of the personnel, facial recognition and/or other features (e.g., height, walking gait, clothing, etc.) can be employed to properly identify the particular personnel. The computer system for the camera and/or sensors 320 can then send the aggregate information on the personnel and medical devices (and other items) to a central computer system 399 (via a wired system or wirelessly). Alternatively, the camera and/or sensors 320 can send captured images and/or sensed information to the central computer system 399 (via a wired system or wirelessly) for recognition and analysis by the central computer system 399. The central computer system 399 can also include one or more processors or other similar control devices as well as one or more memory devices as outlined above.
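A minimal sketch of such on-board processing, assuming OpenCV's bundled frontal-face Haar cascade for the detection and a hypothetical HTTP endpoint on the central computer system 399 for the aggregate information:

```python
import json
import time
import urllib.request

import cv2

# This frontal-face cascade ships with OpenCV's data files.
FACE_CASCADE = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

CENTRAL_URL = "http://central-399.hospital.local/ingest"  # assumed endpoint

def analyze_and_report(frame):
    """Detect people in a frame and send an aggregate summary downstream."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = FACE_CASCADE.detectMultiScale(gray, 1.1, 5)
    payload = {
        "timestamp": time.time(),
        "person_count": len(faces),
        "boxes": [[int(v) for v in box] for box in faces],
    }
    request = urllib.request.Request(
        CENTRAL_URL, data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"})
    urllib.request.urlopen(request)  # wired or wireless transport
```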
The illustrated central computer system 399 uses the information on the personnel and medical devices (and other items), along with further information, to identify and measure opportunities for efficiency improvements that exist within a day-of-surgery workflow, optimize room design elements by specifying equipment placement and personnel movement, and standardize care in an effort to improve patient outcomes. One example of further information is usage details of the medical devices 16, for example, the amount of usage of the shaver 136 (e.g., speed and time), the type of images recorded in the image and video capture and recording device 50, the type of light emitted from the scope light source unit 126, the type and/or amount of fluid passed using the fluid management pump 132, usage of the insufflator 141, and usage of an additional monitor 135, with all of this information being sent to the central computer system 399 (either directly or through another system (e.g., from the image and video capture and recording device 50 when the image and video capture and recording device 50 is connected to the other medical devices 16 in the room 10)). The above list is for example purposes only and is not exhaustive. The usage details from the medical devices 16 can be retrieved by the central computer system 399 or can be sent to the central computer system 399 at a rate dependent on and unique to each medical device 16. Moreover, the method 100 of properly locating the surgical equipment 16 can include an associated method of properly locating personnel 400. In the associated method of properly locating personnel 400, facial recognition software can be used to determine the personnel 400 in the room 10, and the instruction information provided in the method 100 could also include instructions for adding essential personnel for a particular procedure that are currently absent from the room 10. For determining the identity of the personnel, other features in place of or in addition to facial recognition (e.g., height, walking gait, clothing, etc.) can be employed to properly identify the particular personnel.
Finally, the collected data is analyzed at step 408 to optimize performance and thereby improve efficiency in the medical facility. The analyzed data can be used to identify and measure opportunities for efficiency improvements that exist. Software data processing can provide actionable intelligence in real-time or on-demand to pre-defined user groups. The software data processing can be done locally on-site on custom processing hardware or on available server infrastructure, or done remotely in a cloud configuration. The computer system can provide reports and alerts to nurses, surgeons, technicians, and administrators. The collected data can be stored, analyzed, and made available per surgeon/procedure, patient, and institution, and can be used to assist the surgical staff in standardizing care across surgical units, institutions, and regions.
One example of an opportunity for improving efficiency is by tracking personnel movement and patterns. The tracking information can include tracking patient, physician, scrub tech, nurse/non-scrub, personnel not assigned to a current procedure, and unidentified non-hospital personnel entry into and exit from the room 10. Facial recognition software can be employed in step 404 of the method 401 to determine the identity of the personnel. Other features in place of or in addition to facial recognition (e.g., height, walking gait, clothing, etc.) can be employed to properly identify the particular personnel. Such personnel information can be analyzed (along with further information) to determine, for example, the most efficient personnel for a procedure, to improve staffing policies, and to determine if improved security is needed.
In the illustrated example, the facial recognition and/or other features as outlined above can be used to determine the personnel 400 in the room 10, and this information can be saved as part of the medical record. It is contemplated that the cameras 320, including the room camera 36b fixed to the walls 52 or the ceiling 54 of the room 10, the camera 36c in an overhead light 56, a 360° camera, a wide-angle camera, a camera on the computer system 48, the video camera 120 and/or any other camera in the room 10, can be used in the process of identifying the personnel 400 in the room 10. Once the images of the personnel 400 in the room 10 are obtained, facial recognition and/or other features as outlined above can be used to determine the identity of the personnel 400. It is contemplated that the cameras 320 can take an image of everyone in the room 10 at a particular time (e.g., automatically (for example, when the room is scheduled to have surgery performed therein) or manually (for example, by pressing an icon on a touchscreen attached to the computer system 48)). It is also contemplated that the cameras 320 can take images of the personnel 400 in the room 10 over a series of time frames or constantly (e.g., every minute during the time the procedure is scheduled, every time one of the doors 306, 308 is opened, or constant viewing looking for any additional personnel 400 that enter the room 10). The opening of the doors 306, 308 can be viewed using cameras and/or determined from barometric changes in the room. Once the identity of the personnel 400 in the room 10 is determined, a record of the personnel 400 can be saved automatically or manually to the record of the patient (e.g., in the EMR). The identity of the personnel 400 in the room 10 can also be saved in an operative note of the procedure. For example, the personnel 400 in the room 10 during a procedure can be saved in a surgical note created using the process set forth in U.S. patent application Ser. No. 14/853,289 entitled INTRA-SURGICAL DOCUMENTATION SYSTEM.
In the illustrated example, it is contemplated that the facial recognition and/or other recognition techniques as outlined above can be used to confirm the identity of the personnel 400 after the identity of the personnel 400 has been entered into the computer system (e.g., automatically from a scheduling program or manually) or after the identity of the personnel 400 has been identified using another automatic system (e.g., by reading an RFID chip worn by the personnel). Mismatches between the reading using facial recognition and/or other recognition techniques as outlined above and the identity of the personnel 400 entered into the computer system or identified using another automatic system can be flagged for additional review. If the personnel 400 is not entered into the computer system or identified using another automatic system, the identity of the personnel 400 can be confirmed in other manners (e.g., having the personnel 400 speak their name for recordation or enter their name into the computer system). It is further contemplated that the computer system can raise an alarm if improper or blacklisted personnel 400 are in the room.
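The mismatch flagging could, for example, amount to set comparisons between the camera-derived roster and the roster entered into the computer system; a sketch with illustrative names:

```python
def confirm_roster(recognized, scheduled, blacklist=frozenset()):
    """Flag mismatches between recognized and scheduled personnel."""
    flags = []
    for name in recognized - scheduled:
        flags.append(f"review: {name} recognized but not scheduled")
    for name in scheduled - recognized:
        flags.append(f"review: {name} scheduled but not recognized")
    for name in recognized & blacklist:
        flags.append(f"ALARM: blacklisted person {name} in room")
    return flags

flags = confirm_roster(
    recognized={"Dr. A", "Nurse B", "Visitor X"},
    scheduled={"Dr. A", "Nurse B", "Tech C"},
    blacklist={"Visitor X"},
)
print("\n".join(flags))
```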
Another example of an opportunity for improving efficiency is by tracking setup and cleanup of the room 10. For example, the following can be tracked: number of personnel involved, total time of setup and/or cleanup, active working time vs. idle time, time between completion of cleanup and start of next case setup, and time for setting up the room 10 per procedure type. The cameras and/or sensors 320 can capture images and/or sense information in the room 10 (in step 402), and the further information (e.g., schedule for the room and timing) (in step 406) can be analyzed in step 408 to improve the efficiency of cleaning up and setting up the room 10. For example, the quickest cleanups and setups can be analyzed to determine the most efficient method of cleaning up and setting up the room 10 to be used in future cleanups and setups. Moreover, personnel can be rerouted to other areas during their idle time to improve the efficiency of the personnel. Furthermore, the time between cleanup and start of the next setup can be analyzed to reduce the time the room 10 is not being used and thereby maximize use of the room 10.
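Active vs. idle time could, as one approach, be estimated from inter-frame motion seen by a fixed room camera; a sketch using OpenCV frame differencing, with the motion threshold an assumed tuning parameter:

```python
import cv2

def active_idle_seconds(video_path, motion_threshold=8.0):
    """Classify each frame as active or idle from inter-frame motion."""
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
    active = idle = 0
    ok, prev = cap.read()
    while ok:
        ok, frame = cap.read()
        if not ok:
            break
        # Mean absolute pixel difference is a crude but serviceable
        # motion score for a fixed camera.
        score = cv2.absdiff(frame, prev).mean()
        if score > motion_threshold:
            active += 1
        else:
            idle += 1
        prev = frame
    cap.release()
    return active / fps, idle / fps
```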
The analyzed data can also be used for optimizing room design elements for the room 10 and other areas accessed during the day of surgery. The impacted design elements include (but are not limited to): floor plan layout, reflective ceiling plan, equipment placement, optimal staff positioning, storage requirements, optimal size of treatment area, and general workflow efficiency improvements within the hospital. The collected information can include a height of the patient surgical table 200, position of surgical scrubbed staff and physician per procedure, movement of non-scrubbed personnel, entry/exit path of patient, entry/exit path of intra-operative equipment, recognition of case preferences (e.g., where equipment, instruments, and other supplies are placed/positioned per procedure type), movement/positioning of ceiling mounted equipment, number of times equipment was moved or reconfigured, and equipment usage/durations. The efficiency of the room can be optimized by analyzing the collected information and specifying equipment placement and personnel movement in future procedures.
The analyzed data can further be used for optimizing infection control and sterile processing. For example, the following can be tracked: the number of infection incidents, location of infection incidents, number of personnel entries into and exits from the room 10 through the doors 306, 308, duration that the doors 306, 308 are open, number of sterile field violations (i.e., non-scrubbed personnel within 12 inches of the sterile field or sterile back table), sterile field transfer protocol violations, sterile processing department staff time spent on cleaning of the instruments 12, sterile processing department workflow process, and a percentage of critical areas cleaned (e.g., by visually determining whether an area was wiped/cleaned). Such information can be analyzed to reduce infections or to see where infections occur to determine which actions can be taken in the future to reduce the possibility of infection. For example, it is contemplated that the number of times the doors 306, 308 are opened can be associated with post-operative infection information to determine if there is a correlation between the number of times the doors 306, 308 are opened and post-operative infection. If there is a correlation, the medical center can establish procedures for an allowable number of door openings during a particular procedure. Such information (e.g., number of door openings) could be saved with the patient record (e.g., in the EMR). The system could also ascertain a reason for the doors to be opened and a reason for the ingress/egress of personnel for improving workflow efficiency and planning. For example, if a particular type of nurse or doctor has to leave the room 10 several times or enter/exit after the beginning of a procedure, such information could be used to assist in better allocating the schedule and time of that person. Furthermore, the idle time of particular personnel (or type of personnel (e.g., nurse)) could be determined to allow for the particular personnel (or type of personnel (e.g., nurse)) to be reallocated during the typical idle times thereof.
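Counting sterile field violations reduces to counting entries of a tracked position into the predefined region; a minimal sketch assuming an axis-aligned rectangular zone and a time-ordered track of floor positions for one non-scrubbed person:

```python
def count_violations(track, sterile_zone):
    """Count entries of a tracked person into the sterile zone.

    `track` is a time-ordered list of (x, y) floor positions derived from
    the cameras; `sterile_zone` is an (x0, y0, x1, y1) rectangle predefined
    around the sterile field or sterile back table.
    """
    x0, y0, x1, y1 = sterile_zone
    inside_prev = False
    violations = 0
    for x, y in track:
        inside = x0 <= x <= x1 and y0 <= y <= y1
        if inside and not inside_prev:  # count each entry once
            violations += 1
        inside_prev = inside
    return violations

# Two separate incursions into the zone around the back table:
track = [(0, 0), (2, 2), (3, 3), (0, 0), (2.5, 2.5), (0, 0)]
print(count_violations(track, sterile_zone=(2, 2, 4, 4)))  # -> 2
```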
The analyzed data can also be used for optimizing care of a patient. For example, the following information can be tracked: active vs. idle time of each staff member during a procedure, drug administration times and medication error, improvements of the patient over time, delay in treatment, patient movement/lack of movement, patient fall warning/traceability, and location of surgery. If the central computer system 399 determines that something is improper after analyzing the data, the central computer system 399 can provide warnings (e.g., warning of potential wrong-site surgery) to improve care of the patient.
The analyzed data can further be used for attempting to determine causes for readmission of a patient. For example, the following information can be obtained: op/post-op traceability, a linkage of a patient readmission to metrics that occurred during their continuum of care (admission to discharge), infection rate of the treatment room the patient was treated in vs. average infection rate for other rooms, number of non-essential people in the treatment room vs. average for similar cases, time and/or thoroughness spent on terminal cleaning of the treatment room prior to the medical procedure, number of times a non-sterile door was opened during the medical procedure and/or any violations of a sterile field or back table that occurred during the medical procedure. All of this information can be analyzed to determine steps that can be taken in the future to minimize possibilities of readmission of the patient.
The analyzed data can also be used for tracking and improving medical procedures. For example, the following can be observed and recorded: usage and duration of use of the medical devices, personnel using the medical devices, number of sponges and/or needles used during the medical procedure (e.g., to ensure none are lost during the medical procedure), camera position for being minimally invasive during a particular medical procedure, registration and confirmation of implant sizes, anatomical placement of ports (e.g., trocars, scopes and incisions), hand placement of staff and physicians per procedure type, surgical techniques, wasted movements, idle personnel time, handling of instruments (e.g., when and by whom), time and duration of usage of any cutting or RF instrument, estimates on the volume of blood loss or fluid use, time of use of any disposable instrument, and time of activation of any device (e.g., activation of light source). Moreover, it is contemplated that algorithms can be used to identify the personnel 400 entering and exiting the room 10 along with the function of the personnel 400 entering and exiting the room 10 during a particular procedure. Such collected data can be used to develop plans for improving medical procedures. Such information (e.g., observation of number of devices (for example, sponges or needles) used in a procedure or critical procedural steps of a procedure) could prompt notification (e.g., by video or audio alerts) for corrective action that needs to be taken if important devices or steps are skipped or missed.
For all of the image recognition techniques outlined above (e.g., facial and/or other features as outlined above and device recognition), it is contemplated that a database of information needed to recognize the item or person in the image could be stored locally (e.g., in a memory of the computer system 48, 399) or externally. For example, the database of information can be stored externally and obtainable through the Internet or other type of wide area network (WAN), a local area network (LAN), a corporate intranet, any other type of network, a combination of such networks, or can be stored in cloud storage retrievable through a network interface of the computer system 48, 399. Information can be saved in the database of information by saving images into the database of information through any means (e.g., a web application or a mobile application) and associating names (e.g., name of person) or other information (e.g., type of device) with each particular image.
The analysis of video and/or images of the room 10 can be used for recognizing specific conditions of the room 10 and triggering or taking further action within the room depending on the specific conditions. For example, it is contemplated that the cameras 320 could review the rooms 10, and if it is determined that no personnel 400 are in the room 10 (and possibly the additional determination that the time of day is outside normal utilization hours for the room 10), the computer system could trigger an ultraviolet room sterilization system 700 to eliminate pathogens in the room 10. It is further contemplated that the computer system 48, 399 could automatically turn off the ultraviolet room sterilization system 700 if the cameras 320 determine the presence of personnel 400 in the room 10 (e.g., personnel 400 entering the room), unless the computer system 48, 399 is programmed to determine if the personnel 400 are wearing protective gear (e.g., a special suit having a recognizable code or a particular type of reflectivity that can be visually determined) that protects the personnel 400 from ultraviolet light. It is contemplated that the system could take other action upon determination that personnel 400 are in or not in the room 10. For example, devices could be automatically turned on or off depending on whether there are personnel 400 in the room 10 or not in the room 10.
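A sketch of the contemplated interlock logic, with the after-hours window and the protective-gear flag as assumed inputs from scheduling and from the visual recognition described above:

```python
import datetime

def uv_control(person_count, now, protected=False, uv_on=False):
    """Decide whether the UV sterilization system 700 should run.

    Turns the system on only when the room is empty outside assumed
    normal utilization hours, and shuts it off the moment an
    unprotected person is detected.
    """
    after_hours = now.hour >= 22 or now.hour < 5  # assumed quiet window
    if person_count == 0 and after_hours:
        return True
    if uv_on and person_count > 0 and not protected:
        return False  # immediate shutoff for unprotected personnel
    return uv_on and (person_count == 0 or protected)

print(uv_control(0, datetime.datetime(2020, 1, 6, 23, 0)))              # True
print(uv_control(1, datetime.datetime(2020, 1, 6, 23, 5), uv_on=True))  # False
```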
In the illustrated example, one or more of the cameras 320 (e.g., the room cameras) could capture video or images outside of the visible spectrum, and the computer system 48, 399 could take action depending on the analysis of the video or images outside of the visible spectrum. For example, the cameras 320 could be infrared cameras that turn off equipment or provide notification that equipment is too hot for its intended purpose. Therefore, the equipment could be replaced before the equipment malfunctions. The cameras 320 could also be a hyperspectral imaging camera or a multispectral imaging camera, either of which can sense light waves outside of the visible spectrum. The hyperspectral imaging camera or the multispectral imaging camera could provide video or images to the computer system 48, 399 to allow the computer system 48, 399 to notify the personnel 400 of certain conditions or take automatic action under certain conditions. For example, the computer system 48, 399 could alert the personnel 400 if the hyperspectral imaging camera or the multispectral imaging camera detects waste anesthesia gases venting into the room 10 instead of into a waste anesthesia gas disposal system. In another example, the computer system 48, 399 could alert the personnel 400 if the hyperspectral imaging camera or the multispectral imaging camera detects an increased volume of contaminants (e.g., dust or other particulate matter) venting into the room 10 through the HVAC system.
The illustrated room 10 can also include one or more microphones 800 to receive audio in the room 10 for analysis by the computer system 48, 399. The microphone 800 can be fixed within the room 10 or can be portable, and can be wired or wirelessly connected to the computer system 48, 399. For example, it is contemplated that the microphone 800 could be worn by the personnel 400 (e.g., a surgeon) or could be located in any of the devices. The computer system 48, 399 could analyze the audio received by the microphone 800. For example, the computer system 48, 399 could monitor the sounds from the medical equipment 16 for alarms or other telltale sounds that the equipment is not working properly (e.g., a high-pitched whine indicating a clogged air intake filter) and alert the personnel 400 in the room 10 or outside of the room 10 if an alarm count exceeds a preset number for the procedure in the room 10. The alert could be audio within the room, audio to personnel 400 outside the room, turning down music within the room, dimming or flashing lights in the room, placing text or other indicators on monitors, or any other method of alerting personnel of issues. The computer system 48, 399 could monitor instructions from one of the personnel 400 in the room 10 (e.g., a doctor) and provide an audio or visual alert (e.g., on a monitor) if a discrepancy is detected between the instruction and the response, by reviewing the audio response or by analyzing the video or images from the cameras 320 for the non-verbal action taken from the instructions. The computer system 48, 399 could monitor communications of the personnel 400 in the room 10 (e.g., a doctor) and automatically reduce the volume level of music playing in the room 10 under certain conditions (e.g., during prolonged verbal communications between the personnel 400 or when elevated levels of stress in the voices of the personnel 400 are detected). In addition to or as an alternative to reducing the volume level of music playing in the room 10 under the certain conditions, additional personnel 400 could be automatically called to the room 10. The audio recorded by the microphones 800 can be saved on the computer system 48, 399 or in the patient's record for later analysis (e.g., analysis of the audio to make correlations between interruption frequency (from, for example, equipment alarms, phone calls, etc.) and post-operative recovery issues so that potential conclusions can be drawn for improvement in patient safety).
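As one crude approach to the alarm counting, loud bursts in the microphone signal could be counted as upward crossings of an RMS envelope; a real system would match specific alarm tones, and the threshold and window sizes below are assumptions:

```python
import numpy as np

def count_alarm_events(samples, rate, threshold=0.5, min_gap_s=0.5):
    """Count alarm-like bursts as rising edges of a 50 ms RMS envelope."""
    window = int(rate * 0.05)
    n = len(samples) // window
    rms = np.sqrt((samples[: n * window].reshape(n, -1) ** 2).mean(axis=1))
    loud = rms > threshold
    events, last = 0, -min_gap_s * rate
    for i in np.flatnonzero(loud[1:] & ~loud[:-1]):  # rising edges
        t = i * window
        if t - last >= min_gap_s * rate:  # merge bursts closer than the gap
            events += 1
            last = t
    return events

# Synthetic second of audio containing two short alarm bursts:
rate = 8000
signal = np.zeros(rate)
signal[1000:1400] = 0.9
signal[5000:5400] = 0.9
print(count_alarm_events(signal, rate))  # -> 2
```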
In the illustrated embodiments as outlined above, instructions are given to personnel for many reasons. For example, instructions can be given to personnel in order to match actual attributes of the surgical devices with desired attributes of the surgical devices. It is contemplated that an augmented reality system can be used to provide the instructions to the personnel in order to allow them to match the actual attributes with the desired attributes. An example of an augmented reality system that could be used is the Microsoft HoloLens as sold by Microsoft Corporation of Redmond, Wash. The augmented reality system can be worn by the personnel to show exactly where the surgical devices should be positioned.
Although particular preferred embodiments of the invention have been disclosed in detail for illustrative purposes, it will be recognized that variations or modifications of the disclosed apparatus, including the rearrangement of parts, lie within the scope of the present invention.
Schultz, Andrew, Hastings, Sean, Beutter, Richard A.
Patent | Priority | Assignee | Title |
10020075, | Mar 24 2009 | LEAF HEALTHCARE, INC | Systems and methods for monitoring and/or managing patient orientation using a dynamically adjusted relief period |
10269454, | Jan 06 2015 | Stryker Corporation | Method of configuring devices in an operating theater |
10528840, | Jun 24 2015 | Stryker Corporation | Method and system for surgical instrumentation setup and user preferences |
10600204, | Dec 28 2016 | Ocuvera | Medical environment bedsore detection and prevention system |
10631783, | Dec 11 2015 | HEALTHTEXTILES I SVERIGE AB | Method and a system for monitoring healthcare garments |
10688269, | Apr 13 2016 | DRÄGERWERK AG & CO. KGAA | Gas sensor for anesthetic gases and its use |
10859474, | Feb 28 2013 | TRICORNTECH TAIWAN | Real-time on-site gas analysis network for ambient air monitoring and active control and response |
11040155, | May 04 2018 | AFS MEDICAL GMBH | Integrated monitoring and management system for endoscopic surgery |
11051751, | Apr 22 2010 | LEAF HEALTHCARE, INC | Calibrated systems, devices and methods for preventing, detecting, and treating pressure-induced ischemia, pressure ulcers, and other conditions |
11076778, | Dec 03 2020 | Vitalchat, Inc. | Hospital bed state detection via camera |
5376796, | Nov 25 1992 | ADAC Laboratories | Proximity detector for body contouring system of a medical camera |
5432703, | Jun 30 1993 | CLYNCH TECHNOLOGIES, INC | Laser digitizer system for producing orthotic and prosthetic devices |
5477371, | Dec 13 1993 | SHAFIR PRODUCTION SYSTEMS LTD | Three-dimensional, non-contact scanning apparatus and method |
5627586, | Apr 09 1992 | Olympus Optical Co., Ltd. | Moving body detection device of camera |
6044288, | Nov 08 1996 | Imaging Diagnostics Systems, Inc. | Apparatus and method for determining the perimeter of the surface of an object being scanned |
6119033, | Mar 04 1997 | FOLEY HOAG & ELIOT, LLP | Method of monitoring a location of an area of interest within a patient during a medical procedure |
6223137, | Mar 25 1999 | University of Tennessee Research Foundation | Method for marking, tracking, and managing hospital instruments |
6442419, | Sep 20 2000 | Transpacific IP Ltd | Infrared 3D scanning system |
6486778, | Dec 17 1999 | SIEMENS BUILDING TECHNOLOGIES AG, CERBERUS DIVISION | Presence detector and its application |
6987448, | Aug 20 2001 | BEACONMEDAES LLC | Medical gas alarm system |
7248933, | May 08 2001 | Hill-Rom Services, Inc | Article locating and tracking system |
7378975, | Jun 09 2000 | Bed-Check Corporation | Method and apparatus for mitigating the risk of pressure sores |
7457804, | May 10 2002 | Bayer HealthCare LLC | System and method for automated benchmarking for the recognition of best medical practices and products and for establishing standards for medical procedures |
7768414, | May 25 2005 | BEACONMEDAES LLC | Medical gas alarm system |
7783676, | May 19 2006 | Universal Electronics Inc. | System and method for using image data in connection with configuring a universal controlling device |
8231664, | Feb 26 2009 | ADVANCED COOLING THERAPY, INC | Devices and methods for controlling patient temperature |
8452615, | Nov 13 2007 | KARL STORZ SE & CO KG | Method and system for management of operating-room resources |
8754945, | Oct 12 2010 | Hon Hai Precision Industry Co., Ltd. | Image capturing device and motion tracking method |
9183602, | Jun 23 2011 | Cerner Innovation, Inc.; CERNER INNOVATION, INC | Medical device interfacing using a camera |
9280884, | Sep 03 2014 | Oberon, Inc. | Environmental sensor device with alarms |
9305218, | Jun 14 2012 | PREZIO HEALTH | Methods and systems for identifying, marking, and inventorying large quantities of unique surgical instruments |
9442070, | Oct 05 2004 | Photon Systems | Native fluorescence detection methods, devices, and systems for organic compounds |
9452339, | Jun 25 2015 | LILA ATHLETICS INC.; LILA ATHLETICS INC | Automated ball launching system |
9693891, | Sep 11 2012 | PINTLER MEDICAL LLC | Cost-effective systems and methods for enhanced normothermia |
9807475, | Aug 14 2014 | YRIBUS TECHNOLOGIES, LLC | Methods and systems for sensing ambient conditions using passive radio frequency (RF) devices |
9956113, | Mar 12 2013 | The Board of Trustees of the Leland Stanford Junior University | Method and system for regulating core body temperature |
20030216836, | |||
20040186683, | |||
20060234175, | |||
20060243720, | |||
20070027403, | |||
20070239482, | |||
20080281301, | |||
20080312963, | |||
20090300507, | |||
20090323121, | |||
20090326336, | |||
20100225746, | |||
20100290698, | |||
20120014562, | |||
20120078144, | |||
20120083652, | |||
20120140068, | |||
20120268280, | |||
20120316987, | |||
20130247921, | |||
20130267779, | |||
20140006943, | |||
20140036110, | |||
20140039351, | |||
20140267770, | |||
20140276056, | |||
20140285432, | |||
20140358044, | |||
20150086072, | |||
20150149330, | |||
20150190202, | |||
20150288924, | |||
20150302157, | |||
20150310718, | |||
20150317068, | |||
20160061794, | |||
20160061795, | |||
20160078307, | |||
20160085922, | |||
20160103810, | |||
20160120691, | |||
20160203699, | |||
20160220323, | |||
20160224999, | |||
20160239718, | |||
20160239795, | |||
20160314258, | |||
20160379504, | |||
20170140113, | |||
20170185864, | |||
20170231577, | |||
20170281073, | |||
20180011983, | |||
20180071137, | |||
20180243025, | |||
20190104982, | |||
20190307405, | |||
20190311599, | |||
20190328598, | |||
20190374375, | |||
20190374387, | |||
20200113488, | |||
20200155059, | |||
20200245950, | |||
20200246180, | |||
20200405217, | |||
20210038084, | |||
CN111352378, | |||
CN112426266, | |||
CN202724392, | |||
CN204618192, | |||
CN211432863, | |||
DE4339379, | |||
EP2143090, | |||
EP2391317, | |||
EP3556318, | |||
WO2004014246, | |||
WO2007106040, | |||
WO2010086740, | |||
WO2013104420, | |||
WO2013186160, | |||
WO2015092627, | |||
WO2017183602, | |||
WO2020031147, | |||
WO2020165918, | |||
WO2020243527, | |||
WO2020264140, | |||
WO2021021609, | |||
WO2021096996, | |||
WO2021097367, |