A system and method for providing image guidance for placement of one or more medical devices at a target location. The system can be used to determine one or more affected regions corresponding to the operation of one or more medical devices and display at least a portion of the one or more affected regions. The affected regions can correspond to predicted affected regions and/or dynamic affected regions and can be based at least in part on a variance parameter of the medical device.

Patent No.: 10,820,944
Priority: Oct. 2, 2014
Filed: Jan. 29, 2018
Issued: Nov. 3, 2020
Expiry: Oct. 1, 2035
Term: Subject to a terminal disclaimer
1. A method, comprising:
receiving first emplacement data corresponding to a first medical device;
determining an emplacement of a virtual medical device with respect to a point-of-view location based at least in part on the first emplacement data, wherein the virtual medical device corresponds to the first medical device;
determining a first estimated ablation volume based at least in part on real-time ablation data associated with the first medical device and the first emplacement data;
receiving operating parameters corresponding to the first medical device, the operating parameters comprising a variance parameter that corresponds to a margin of error associated with said determining the first estimated ablation volume;
determining a second estimated ablation volume based at least in part on the variance parameter and the first emplacement data, wherein a size of the second estimated ablation volume is different from a size of the first estimated ablation volume; and
causing one or more displays to display:
a perspective view of at least a portion of the virtual medical device,
a perspective view of the first estimated ablation volume, and
a perspective view of the second estimated ablation volume.
17. A computer-readable, non-transitory storage medium storing computer-executable instructions that when executed by one or more processors cause the one or more processors to:
receive first emplacement data corresponding to a first medical device;
receive second emplacement data corresponding to a second medical device;
determine an emplacement of an image slice with respect to a point-of-view location, based on the second emplacement data;
determine a first ablation region on the image slice based at least in part on real-time ablation data associated with the first medical device, the determined emplacement of the image slice, and the first emplacement data;
identify a variance parameter corresponding to the first medical device, wherein the variance parameter corresponds to a margin of error associated with the determination of the first ablation region;
determine a second ablation region on the image slice based at least in part on the variance parameter, the determined emplacement of the image slice, and the first emplacement data, wherein the second ablation region is different in size from the first ablation region; and
cause one or more displays to display:
a perspective view of the image slice based at least in part on the emplacement of the image slice,
the first ablation region on the image slice, and
the second ablation region on the image slice.
5. A system, comprising:
one or more processors communicatively coupled with one or more displays, and a non-transitory computer-readable storage medium storing computer-executable instructions that when executed by the one or more processors cause the one or more processors to:
receive emplacement data corresponding to a medical device;
determine emplacement of a virtual medical device corresponding to the medical device with respect to a point-of-view location based at least in part on the emplacement data corresponding to the medical device;
determine an estimated ablation volume based at least in part on real-time ablation data associated with the medical device;
receive operating parameters corresponding to the medical device, the operating parameters comprising at least a variance parameter that corresponds to a variance in a size of the estimated ablation volume;
determine a first predicted affected region based at least in part on the variance parameter, wherein the first predicted affected region is smaller than the estimated ablation volume;
determine a second predicted affected region based at least in part on the variance parameter, wherein the second predicted affected region is larger than the estimated ablation volume;
determine emplacement of the first predicted affected region and the second predicted affected region with respect to the point-of-view location based at least in part on the emplacement data corresponding to the medical device; and
cause one or more displays to concurrently display:
a perspective view of at least a portion of the virtual medical device based at least in part on the emplacement of the virtual medical device,
a perspective view of the estimated ablation volume,
a perspective view of at least a portion of the first predicted affected region based at least in part on the determined emplacement of the first predicted affected region, and
a perspective view of at least a portion of the second predicted affected region based at least in part on the determined emplacement of the second predicted affected region.
2. The method of claim 1, wherein the second estimated ablation volume corresponds to at least one of a smallest-possible ablation volume of the first medical device based on the operating parameters or a largest-possible ablation volume of the first medical device based on the operating parameters.
3. The method of claim 1, wherein portions of the second estimated ablation volume that are closer to a surface of the second estimated ablation volume are displayed at a different opacity than portions of the second estimated ablation volume that are farther away from a surface of the second estimated ablation volume.
4. The method of claim 1, wherein the point-of-view location comprises at least one of a location of a user, an expected location of the user, or a fixed location relative to the one or more displays.
6. The system of claim 5, wherein the point-of-view location comprises at least one of a location of a user, an expected location of the user, or a fixed location relative to the one or more displays.
7. The system of claim 5, wherein the estimated ablation volume is dynamically updated based at least in part on the real-time ablation data.
8. The system of claim 5, wherein the computer-executable instructions when executed further cause the one or more processors to:
determine a third predicted affected region based at least in part on the variance parameter, wherein the third predicted affected region is greater than the first predicted affected region and less than the second predicted affected region;
determine emplacement of the third predicted affected region with respect to the point-of-view location based at least in part on the emplacement data corresponding to the medical device; and
cause one or more displays to concurrently display a perspective view of at least a portion of the third predicted affected region based at least in part on the determined emplacement of the third predicted affected region.
9. The system of claim 5, wherein the variance parameter corresponds to a margin of error associated with said determining the estimated ablation volume.
10. The system of claim 5, wherein the variance parameter corresponds to a variance in a volume that is affected by a medical procedure corresponding to the medical device.
11. The system of claim 5, wherein the variance parameter includes a first threshold and a second threshold, the second threshold being greater than the first threshold, wherein the first threshold is used to determine the first predicted affected region and the second threshold is used to determine the second predicted affected region.
12. The system of claim 5, wherein the medical device comprises an ablation needle and the variance parameter comprises a power level of the ablation needle.
13. The system of claim 5, wherein the at least a portion of the first predicted affected region comprises at least one of a surface of the first predicted affected region, an outline of the first predicted affected region, alternating horizontal bands of differing opacity, alternating vertical bands of differing opacity, alternating tiles of differing opacity, at least a portion of the first predicted affected region located between an image slice and the point-of-view location, at least a portion of the first predicted affected region that is co-located with at least a portion of the image slice, or at least a portion of the first predicted affected region that is co-located with at least a portion of the virtual medical device.
14. The system of claim 5, wherein the at least a portion of the second predicted affected region comprises at least one of a surface of the second predicted affected region, an outline of the second predicted affected region, alternating horizontal bands of differing opacity, alternating vertical bands of differing opacity, alternating tiles of differing opacity, at least a portion of the second predicted affected region located between an image slice and the point-of-view location, at least a portion of the second predicted affected region that is co-located with at least a portion of the image slice, or at least a portion of the second predicted affected region that is co-located with at least a portion of the virtual medical device.
15. The system of claim 5, wherein portions of the at least a portion of the second predicted affected region that are closer to a surface of the at least a portion of the second predicted affected region are displayed at a different opacity than portions of the at least a portion of the second predicted affected region that are farther away from a surface of the at least a portion of the second predicted affected region.
16. The system of claim 5, wherein the at least a portion of the second predicted affected region corresponds to at least a portion of the second predicted affected region that is unique to the second predicted affected region with respect to the first predicted affected region.
18. The computer-readable, non-transitory storage medium of claim 17, wherein the second ablation region corresponds to at least one of a smallest-possible ablation region of the first medical device based on the variance parameter or a largest-possible ablation region of the first medical device based on the variance parameter.
19. The computer-readable, non-transitory storage medium of claim 17, wherein portions of the second ablation region that are closer to a surface of the second ablation region are displayed at a different opacity than portions of the second ablation region that are farther away from a surface of the second ablation region.
20. The computer-readable, non-transitory storage medium of claim 17, wherein the point-of-view location comprises at least one of a location of a user, an expected location of the user, or a fixed location relative to the one or more displays.

The present application is a continuation of U.S. patent application Ser. No. 14/872,930, filed Oct. 1, 2015, entitled AFFECTED REGION DISPLAY ASSOCIATED WITH A MEDICAL DEVICE, which claims priority benefit to U.S. Provisional Application No. 62/059,077, filed Oct. 2, 2014, entitled ABLATION AREA VISUALIZATIONS, each of which is hereby incorporated herein by reference in its entirety. Any and all applications for which a foreign or domestic priority claim is identified in the Application Data Sheet as filed with the present application are incorporated by reference under 37 CFR 1.57 and made a part of this specification.

The systems and methods disclosed herein relate generally to computer systems facilitating medical device guidance through tissue by a medical practitioner.

Various medical device systems are available to aid a healthcare provider to guide a medical device in a patient. The medical device systems can provide various image guidance cues to aid the healthcare provider, and can also provide views of images of an imaged area and of virtual medical devices corresponding to physical medical devices.

FIG. 1 is a diagram of an embodiment of a system for image-guided medical procedures.

FIG. 2 is a diagram of an embodiment of a rendering of image guidance cues and medical display objects on a display.

FIGS. 3A, 3B, 3C, 3D, 3E, 3F, 3G, 3H, 3I, and 3J are diagrams illustrating embodiments of displayed affected regions.

FIGS. 4A, 4B, and 4C are diagrams illustrating embodiments of surface display regions.

FIGS. 5A, 5B, and 5C are diagrams illustrating embodiments of displayed affected regions, including surface display regions.

FIG. 6 is a flow diagram illustrative of an embodiment of a routine implemented by the system to display a displayed affected region.

FIG. 7 is a flow diagram illustrative of an embodiment of a routine implemented by the system to display displayed affected regions.

Implementations disclosed herein provide systems, methods, and apparatus for generating images facilitating medical device insertion into tissue by an operator. Certain embodiments pertain to a free-hand medical device guidance system. The system can provide the healthcare provider manual control over the medical device, while making the spatial relationships between the target, medical device, and U/S image more intuitive via a visual display. Using this visual feedback, the operator can adjust the medical device's position, orientation, or trajectory. Certain of the contemplated embodiments can be used in conjunction with systems described in greater detail in U.S. patent application Ser. No. 13/014,587, filed Jan. 26, 2011, entitled SYSTEMS, METHODS, APPARATUSES, AND COMPUTER-READABLE MEDIA FOR IMAGE MANAGEMENT IN IMAGE-GUIDED MEDICAL PROCEDURES, U.S. patent application Ser. No. 13/753,274, filed Jan. 29, 2013, entitled MULTIPLE MEDICAL DEVICE GUIDANCE (the '274 application), and U.S. patent application Ser. No. 14/212,933, filed Mar. 14, 2014, entitled MEDICAL DEVICE GUIDANCE, each of which is hereby incorporated by reference in its entirety.

The system can aid the healthcare provider in guiding one or more medical devices through the tissue of the patient and/or placing the medical devices, and can be used for treatment of tumors, fibroids, or cysts with bipolar radiofrequency medical device ablation, multiple microwave medical devices, electroporation, and/or electrochemotherapy systems. It can also be used for nerve or muscle stimulation or sensing (e.g., electrodes in the spine or brain). The system can be used during open surgery, laparoscopic surgery, endoscopic procedures, biopsies, and/or interventional radiology procedures.

The system can be used in conjunction with live intraoperative ultrasound (U/S), pre-operative CT, or any cross-sectional medical imaging modality (e.g., MRI, OCT, etc.). In addition, the system can use a variety of techniques to determine the position and/or orientation of one or more medical devices. For example, the system can use the NDI Aurora magnetic system, NDI Polaris optical system, etc. In some embodiments, a position sensor can be embedded inside, or affixed to, each medical device at the tip, along the shaft, and/or on the handle. Sensors can be built into the medical devices or attached after manufacturing, as described in greater detail in U.S. application Ser. No. 14/212,184, filed Mar. 14, 2014, entitled SENSOR MOUNT, which is incorporated herein by reference in its entirety.

Each medical device can be associated with one or more sensors, which can continually, or repeatedly, report position and/or orientation, or a single sensor can be used for all the medical devices. In embodiments where one sensor is used, the healthcare provider can attach the sensor to the particular medical device that she is intentionally repositioning, and then, once she has placed that medical device, she can remove the sensor and attach it to the next medical device she is repositioning. In some embodiments, the medical devices can be manipulated by the healthcare provider. In certain embodiments, the system can be used with a robotic manipulator, where the robot controls the medical devices.

In some embodiments, the handles of medical devices can have push-button switches to allow the user to select a medical device, indicate a tissue target, etc. The handle can also have an indicator light to indicate to the user which medical device is selected. Finally, the handle can have an encoder to detect how much length of electrode has been exposed by the user, and report this information to the guidance system and therapeutic generator.

Image Guidance Systems

FIG. 1 is a diagram illustrating an embodiment of a system for image management in image-guided medical procedures. In some embodiments, the position sensing unit 140 can track surgical instruments, also referred to herein as medical devices, within a tracking area and provide data to the image guidance unit 130. The medical devices can include invasive medical devices, such as, but not limited to, biopsy needles, ablation needles, surgical needles, nerve-block needles, or other needles, electrocautery devices, catheters, stents, laparoscopes or laparoscopic cameras, ultrasound transducers, or other instruments that enter a part of the body, and non-invasive medical devices that do not enter the body, such as, but not limited to, ultrasound transducers, probes, or other external imaging devices, etc. The medical devices can also include medical imaging devices that provide or aid in the selection of medical images for display. In some embodiments, the medical imaging device can be any device that is used to select a particular medical image for display. The medical imaging devices can include invasive medical devices, such as laparoscopic cameras, and non-invasive medical devices, such as external ultrasound transducers.

Although only two surgical instruments 145 and 155 are shown in FIG. 1, it will be understood that additional surgical instruments can be tracked and associated data can be provided to the image guidance unit 130. The image guidance unit 130 can process or combine the data and show image guidance data on display 120. This image guidance data can be used by a healthcare provider to guide a procedure and improve care. There are numerous other possible embodiments of system 100. For example, many of the depicted components can be joined together to form a single component and can be implemented in a single computer or machine. Further, additional position sensing units can be used in conjunction with position sensing unit 140 to track all relevant surgical instruments 145 and 155, as discussed in more detail below. Additional imaging units 150 can be included, and combined imaging data from the multiple imaging units 150 can be processed by image guidance unit 130 and shown on display unit 120. Additionally, two or more surgical systems 149 can also be included.

Information about and from multiple surgical systems 149 and attached surgical instruments 145 (and additional surgical instruments not shown) can be processed by image guidance unit 130 and shown on display 120. These and other possible embodiments are discussed in more detail below. It will be understood that any combination of the display objects, image guidance cues, etc., described herein can be displayed concurrently, or simultaneously. Further, reference to displaying objects “concurrently” and/or “simultaneously” is to be interpreted broadly and may refer to displaying objects in such a way that to a human observer the objects are visible at the same time.

Imaging unit 150 can be coupled to image guidance unit 130. In some embodiments, imaging unit 150 can be coupled to a second display unit (not shown). The second display unit can display imaging data from imaging unit 150. The imaging data displayed on display unit 120 and displayed on second display unit can be the same or different. In some embodiments, the imaging unit 150 is an ultrasound machine 150, the movable imaging device 155 is an ultrasound transducer 155 or ultrasound probe 155, and the second display unit is a display associated with the ultrasound machine 150 that displays the ultrasound images from the ultrasound machine 150. In some embodiments, a movable imaging unit 155 can be connected to image guidance unit 130. The movable imaging unit 155 can be useful for allowing a user to indicate what portions of a first set of imaging data are to be displayed. For example, the movable imaging unit 155 can be an ultrasound transducer 155, a needle or other medical device, for example, and can be used by a user to indicate what portions of imaging data, such as a pre-operative CT scan, to show on a display unit 120 as image 125. Further, in some embodiments, there can be a third set of pre-operative imaging data that can be displayed with the first set of imaging data.

In some embodiments, system 100 comprises a display unit 120 and a position sensing unit 140 communicatively coupled to image guidance unit 130. In some embodiments, position sensing unit 140, display unit 120, and image guidance unit 130 are coupled to the stand 170. Image guidance unit 130 can be used to produce images 125 that are displayed on display unit 120. The images 125 produced on display unit 120 by the image guidance unit 130 can be determined based on ultrasound or other visual images from the first surgical instrument 145 and second surgical instrument 155. In the illustrated embodiment, the images 125 include a 2D viewing area and a 3D viewing area. The 2D viewing area includes a 2D view of each of an ultrasound slice 121, a virtual medical device 122 corresponding to the first surgical instrument 145, a virtual imaging device 123 corresponding to the second surgical instrument 155, surface display regions 124a, 124b, intersection indicator 126, and trajectory and other image guidance cues 127. In the illustrated embodiment, the 3D viewing area includes perspective views of each of the image slice 121, the virtual medical device 122, a displayed affected region 129 including the surface display regions 124a, 124b, the virtual imaging device 123, intersection indicator 126, trajectory and other image guidance cues 127, and a patient orientation indicator 128. It will be understood that any combination of the aforementioned display objects can be displayed in the 2D view and/or 3D view as desired.

As a non-limiting example, if the first surgical instrument 145 is an ablation needle 145 and the second surgical instrument 155 is an ultrasound probe 155, then images 125 produced on display 120 can include the images, or video, from the ultrasound probe 155 (e.g., image slice 121) combined with other medical display objects and image guidance cues, such as projected medical device drive (e.g., trajectory indicators 127) or projected ablation volume (e.g., displayed affected region 129), determined based on the emplacement of ablation needle 145. If the first surgical instrument 145 is an ultrasound probe 145 and the second surgical instrument 155 is a laparoscopic camera 155, then images 125 produced on display 120 can include the video from the laparoscopic camera 155 combined with ultrasound data superimposed on the laparoscopic image. More surgical instruments can be added to the system. For example, the system can include an ultrasound probe, ablation needle, laparoscopic camera, stapler, cauterizer, scalpel, and/or any other surgical instrument or medical device. The system can also process and/or display collected data, such as preoperative CT scans, X-rays, MRIs, laser-scanned 3D surfaces, etc.

The term “emplacement” as used herein is a broad term and may refer to, without limitation, position and/or orientation or any other appropriate location information. The term “pose” as used herein is a broad term encompassing its plain and ordinary meaning and may refer to, without limitation, position and orientation or any other appropriate location information. In some embodiments, the imaging data obtained from one or both of surgical instruments 145 and 155 can include other modalities such as a CT scan, MRI, open-magnet MRI, optical coherence tomography (“OCT”), positron emission tomography (“PET”) scans, fluoroscopy, ultrasound, or other preoperative, or intraoperative 2D or 3D anatomical imaging data. In some embodiments, surgical instruments 145 and 155 can also be scalpels, implantable hardware, or any other device used in surgery. Any appropriate surgical system 149 or imaging unit 150 can be attached to the corresponding medical instruments 145 and 155.

As noted above, images 125 produced can also be generated based on live, intraoperative, or real-time data obtained using the second surgical instrument 155, which is coupled to second imaging unit 150. The term “real time” as used herein is a broad term and has its ordinary and customary meaning, including without limitation instantaneously or nearly instantaneously. The use of the term real time can also mean that actions are performed or data is obtained with the intention to be used immediately, upon the next cycle of a system or control loop, or any other appropriate meaning. Additionally, as used herein, real-time data can be data that is obtained at a frequency that would allow a healthcare provider to meaningfully interact with the data during surgery. For example, in some embodiments, real-time data can be a medical image of a patient that is updated one time per second. In some embodiments, real-time data can be ultrasound data that is updated multiple times per second.

The surgical instruments 145, 155 can be communicatively coupled to the position sensing unit 140 (e.g., sensors embedded or coupled to the surgical instruments 145, 155 can be communicatively coupled with the position sensing unit 140). The position sensing unit 140 can be part of imaging unit 150 or it can be separate. The position sensing unit 140 can be used to determine the emplacement of first surgical instrument 145 and/or the second surgical instrument 155. In some embodiments, the position sensing unit 140 can include a magnetic tracker and/or one or more magnetic coils can be coupled to surgical instruments 145 and/or 155. In some embodiments, the position sensing unit 140 can include an optical tracker and/or one or more visually-detectable fiducials can be coupled to surgical instruments 145 and/or 155. In some embodiments, the position sensing unit 140 can be located below the patient. In such embodiments, the position sensing unit 140 can be located on or below the table 180. For example, in embodiments where the position sensing unit 140 is a magnetic tracker, it can be mounted below the surgical table 180. Such an arrangement can be useful when the tracking volume of the position sensing unit 140 is dependent on the location of the position sensing unit 140, as with many magnetic trackers. In some embodiments, magnetic tracking coils can be mounted in or on the medical devices 145 and 155.

In some embodiments, the position sensing unit 140 can be an electromagnetic measurement system (e.g., NDI Aurora system) using sensor coils for tracking units attached to the first and/or second surgical devices 145 and 155. In some embodiments, the position sensing unit 140 can be an optical 3D tracking system using fiducials. Such optical 3D tracking systems can include the NDI Polaris Spectra, Vicra, Certus, PhaseSpace IMPULSE, Vicon MX, InterSense IS-900, NaturalPoint OptiTrack, Polhemus FastTrak, IsoTrak, or Claron MicronTracker2. In some embodiments, the position sensing unit 140 can be an inertial 3D tracking system comprising a compass, accelerometer, tilt sensor, and/or gyro, such as the InterSense InertiaCube or the Nintendo Wii controller. In some embodiments, the position sensing unit 140 can be attached to or affixed on the corresponding surgical device 145 and 155.

In some embodiments, the position sensing unit 140 can include sensing devices such as the HiBall tracking system, a GPS device, or a signal-emitting device that would allow for tracking of the position and/or orientation (e.g., emplacement) of the tracking unit (also referred to as an emplacement sensor). In some embodiments, a position sensing unit 140 can be affixed to either or both of the surgical devices 145 and 155. The surgical devices 145 or 155 can be tracked by the position sensing unit 140. A room coordinate system reference, such as the display 120, can also be tracked by the position sensing unit 140 in order to determine the emplacements of the surgical devices 145 and 155 with respect to the room coordinate system. Devices 145 and 155 can also include or have coupled thereto one or more accelerometers, which can be used to estimate movement, position, and location of the devices.

In some embodiments, the position sensing unit 140 can be an Ascension Flock of Birds, Nest of Birds, driveBAY, medSAFE, trakSTAR, miniBIRD, MotionSTAR, pciBIRD, or Calypso 2D Localization System and tracking units attached to the first and/or second medical devices 145 and 155 can be magnetic tracking coils.

The term “tracking unit” (also referred to as an emplacement sensor), as used herein, is a broad term encompassing its plain and ordinary meaning and includes without limitation all types of magnetic coils or other magnetic field sensing devices for use with magnetic trackers, fiducials or other optically detectable markers for use with optical trackers, such as those discussed above and below. In some embodiments, the tracking units can be implemented using optical position sensing devices, such as the HiBall tracking system and the position sensing unit 140 can form part of the HiBall tracking system. Tracking units can also include a GPS device or signal emitting device that allows for tracking of the position and/or orientation of the tracking unit. In some embodiments, a signal emitting device might include a radio-frequency identifier (RFID). In such embodiments, the position sensing unit 140 can use the GPS coordinates of the tracking units or can, for example, triangulate the radio frequency signal being emitted by the RFID associated with tracking units. The tracking systems can also include one or more 3D mice.

Images 125 can be produced based on intraoperative or real-time data obtained using first surgical instrument 145, which is coupled to first surgical system 149. In the illustrated embodiment of FIG. 1, the first surgical system 149 is shown as coupled to image guidance unit 130. The coupling between the first surgical system 149 and image guidance unit 130 may not be present in all embodiments. In some embodiments, the coupling between first surgical system 149 and image guidance unit 130 can be included where information about first surgical instrument 145 available to first surgical system 149 is useful for the processing performed by image guidance unit 130. For example, in some embodiments, the first surgical instrument 145 is an ablation needle 145 and first surgical system 149 is an ablation system 149. In some embodiments, it can be useful to send a signal about the relative strength of planned ablation from ablation system 149 to image guidance unit 130 in order that image guidance unit 130 can show a predicted ablation volume. In other embodiments, the first surgical system 149 is not coupled to image guidance unit 130. Example embodiments including images and graphics that can be displayed are included below.

In some embodiments, the display unit 120 displays 3D images to a user, such as a healthcare provider. Stereoscopic 3D displays separate the imagery shown to each of the user's eyes. This can be accomplished by a stereoscopic display, a lenticular auto-stereoscopic display, or any other appropriate type of display. The display 120 can be an alternating row or alternating column display. Example alternating row displays include the Miracube G240S, as well as Zalman Trimon Monitors. Alternating column displays include devices manufactured by Sharp, as well as many “auto-stereoscopic” displays (e.g., Philips). In some embodiments, Sony and Panasonic 3D passive displays and LG, Samsung, and/or Vizio 3D TVs can be used as well. Display 120 can also be a cathode ray tube. Cathode ray tube (CRT) based devices can use temporal sequencing, showing imagery for the left and right eye in temporal sequential alternation. This method can also be used by projection-based devices, as well as by liquid crystal display (LCD) devices, light emitting diode (LED) devices, and/or organic LED (OLED) devices.

In certain embodiments, the display unit 120 can be a head mounted display worn by the user in order to receive 3D images from the image guidance unit 130. In such embodiments, a separate display, such as the pictured display unit 120, can be omitted. The 3D graphics can be produced using underlying data models, stored in the image guidance unit 130 and projected onto one or more 2D planes in order to create left and right eye images for a head mount, lenticular, or other 3D display. The underlying 3D model can be updated based on the relative emplacements of the various devices 145 and 155, as determined by the position sensing unit(s) 140, and/or based on new data associated with the devices 145 and 155. For example, if the second medical device 155 is an ultrasound probe, then the underlying data model can be updated to reflect the most recent ultrasound image. If the first medical device 145 is an ablation needle, then the underlying model can be updated to reflect any changes related to the needle, such as power or duration information. Any appropriate 3D graphics processing can be used for rendering including processing based on OpenGL, Direct3D, Java 3D, etc. Whole, partial, or modified 3D graphics packages can also be used, such packages including 3DS Max, SolidWorks, Maya, Form Z, Cybermotion 3D, VTK, Slicer, or any others. In some embodiments, various parts of the needed rendering can occur on traditional or specialized graphics hardware. The rendering can also occur on the general CPU, on programmable hardware, on a separate processor, be distributed over multiple processors, over multiple dedicated graphics cards, or using any other appropriate combination of hardware or technique.

One or more components, units, devices, or elements of various embodiments can be packaged and/or distributed as part of a kit. For example, in one embodiment, an ablation needle, one or more tracking units, 3D viewing glasses, and/or a portion of an ultrasound wand can form a kit. Other embodiments can have different elements or combinations of elements grouped and/or packaged together. Kits can be sold or distributed separately from or with the other portions of the system.

One will readily recognize that there are numerous other examples of image guidance systems which can use, incorporate, support, or provide for the techniques, methods, processes, and systems described herein.

Depicting Surgical Instruments

It can often be difficult to discern the content of a 3D scene from a 2D depiction of it, or even from a 3D depiction of it. Therefore, various embodiments herein provide image guidance that can help the healthcare provider better understand the scene and the relative emplacements or poses of objects in the scene, thereby providing improved image guidance.

FIG. 2 illustrates a perspective view of a virtual rendering 202 of a surgical instrument 242 being displayed on a screen 220 with a perspective view of a medical image 204. In some embodiments, the screen 220 can correspond to the screen of a display unit 120, which can be implemented using a TV, computer screen, head-mounted display, projector, etc. In the illustrated embodiment, the rendered surgical instrument 202 displayed on the screen 220 corresponds to the ablation needle 242. A wire 246 connecting the ablation needle 242 to an ablation system is also depicted in FIG. 2.

Although only one virtual surgical instrument 202 is displayed, it will be understood that multiple medical devices can be tracked and displayed concurrently, or simultaneously, on screen 220, as described in greater detail in the '274 application, previously incorporated by reference. For example, a virtual rendering of the medical imaging device 222 can be displayed.

The virtual surgical instrument 202 can be displayed in a virtual 3D space with the screen 220 acting as a window into the virtual 3D space. Thus, as the surgical instrument 242 is moved to the right with respect to a point-of-view location (e.g., the location of the point-of-view for viewing the 3D space), the virtual surgical instrument 202 also moves to the right. Similarly, if the surgical instrument 242 is rotated 90 degrees so that the tip of the surgical instrument is pointing away from the point-of-view location (e.g., at the screen 220), the virtual surgical instrument 202 will likewise show the change in orientation, and show the tip of the virtual surgical instrument 202 in the background and the other end of the virtual surgical instrument 202 in the foreground. In some embodiments, as described in greater detail in U.S. application Ser. No. 14/212,933, incorporated herein by reference in its entirety, the point-of-view location can be a fixed location, such as a predetermined distance/angle from the screen 220 or stand 170 and/or a location configured by the user; or the point-of-view location can be dynamic. For example, the system can track a user in real-time and determine the point-of-view location based at least in part on the tracked location of the user.
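To make the geometry concrete, the following is a minimal sketch (not part of the disclosure; the transform names and use of NumPy are illustrative assumptions) of how a renderer might express a tracked device pose relative to the point-of-view location, so that physical motion maps directly to on-screen motion:

```python
# Minimal sketch, assuming 4x4 homogeneous transforms from a tracker.
# `world_from_device` and `world_from_view` are hypothetical names.
import numpy as np

def view_from_device(world_from_device: np.ndarray,
                     world_from_view: np.ndarray) -> np.ndarray:
    """Return the device pose expressed in the viewer's coordinate frame.

    Because the renderer draws in view coordinates, moving the physical
    device to the right of the point-of-view location moves the virtual
    device to the right on screen.
    """
    view_from_world = np.linalg.inv(world_from_view)
    return view_from_world @ world_from_device
```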

Some models of medical devices have markings such as bands around the shaft (to indicate distance along the shaft), and a colored region 203 near the tip to indicate from where the radio frequency or microwave energy is emitted in the case of an ablation probe. Healthcare providers performing medical device procedures are often familiar with these markings and can use them to help understand the spatial relationship between the medical device and anatomy. In some embodiments, the make and model of the medical device 242 is known to the image guidance system and the virtual medical device 202 displayed in display 220 can resemble medical device 242. The features of medical devices that can be rendered in the scene include the overall shape (diameter, cross-sectional shape, curvature, etc.), color, distance markers, visual or echogenic fiducial markers, the state of deployable elements such as tines, paddles, anchors, resection loops, stiffening or steerable sleeves, temperature, radiation, light or magnetic field sensors, lens, waveguides, fluid transfer channels, and the like.

The type of medical device being used can be input into the image guidance system 100, can be a system default, can be detected by a camera or other device, can be received as data from an attached medical device, such as surgical system 149 in FIG. 1, or the information can be received in any other appropriate manner. Displaying, on display 220, a virtual surgical instrument that resembles the surgical instrument 242 can help healthcare providers associate the image guidance data with the real world and can provide more familiar guidance information to a healthcare provider, thereby further aiding the healthcare provider in the guidance task. For example, the healthcare provider can see the familiar markings on the medical device being displayed on the display 220 and therefore be familiar with the distance and relative placement of the displayed medical device with respect to other data, such as a tumor 212 seen in a rendered ultrasound image 204, 205. This knowledge of relative placement of items being displayed can help the healthcare provider move the medical device into place.

Consider an embodiment in which the virtual surgical instrument 202 in the display 220 is an ablation needle depicting the portion of the needle that will perform the ablation, for example, the portion that emits the radio or microwave energy. If the display 220 also includes ultrasound data, then the doctor may be able to find the tumor 212 she wishes to ablate by moving the ultrasound probe around until she spots the tumor 212. In various embodiments, she will be able to see the displayed ultrasound data and its location relative to the displayed medical device with the markings. She can then drive the medical device until she sees, on display 220, that the emitter-portion of the medical device encompasses the tumor in the ultrasound, also seen on display 220. When she activates the ablation, she can then be much more certain that she has ablated the correct portion of the tissue. Various embodiments of this are discussed below.

As another example, consider the physical markings that can be on the instruments themselves. These markings can help orient a healthcare provider during use of the instrument. In some embodiments, the image guidance unit can represent these markings in the images displayed in the display. For example, certain ultrasound transducers are built with an orientation mark (e.g., a small bump) on one side of the transducing array. That mark can also be shown in the ultrasound image on the scanner's display, to help the healthcare provider understand where the scanned anatomical structures shown on screen are located under the transducer, inside the patient. In some embodiments, the image guidance system can display a symbolic 3D representation of the orientation mark both next to the motion-tracked ultrasound slice (e.g., moving with the displayed ultrasound slice) and next to the 2D view of the ultrasound slice also displayed by the system. An example of this is displayed in FIG. 2, where a small rectilinear volume 214 corresponding to a feature on an ultrasound probe is shown both in proximity to the ultrasound slice displayed in the 3D view and the ultrasound slice displayed in a 2D view.

It will be understood that an image slice can correspond to image data received from an imaging device, such as an ultrasound transducer. In some embodiments, the image data can correspond to a cross-section of tissue having a certain thickness. In some instances, the imaging device can compact the image data, and/or treat the image data as 2D data, such that there is no perceived thickness. In certain embodiments, when the image slice is displayed in a 3D view, the system can treat the image slice as a 2D or quasi-2D object. In such embodiments, the system can cause the image slice to have little to no perceptible thickness. Accordingly, in certain embodiments, when the image slice is oriented orthogonally or perpendicularly with respect to the point-of-view location, the system can cause the display to display nothing or a line having a relatively small thickness, such as a few pixels, etc. In some cases, the number of pixels used to display the relatively small thickness of the image slice can correspond to the size of the display. For example, more pixels can be used for a larger display and fewer pixels can be used for a smaller display, etc.

Other embodiments can track and display other types of instruments and their features. For example, a healthcare provider may want to track one or more of a scalpel, a biopsy needle, a cauterizer (including an electrocauterizer and Bovies), forceps, cutting loops on hysteroscopes, harmonic shears, lasers (including CO2 lasers), etc. For example, in various embodiments, the following devices can be tracked and various aspects of their design displayed on display 220: Olympus™ OES Pro Hystero-Resectoscope, SonoSurg Ultrasonic Surgical System, Olympus™ GF-UC 160 Endoscope, Wallus™ Embryo Transfer Catheter, AngioDynamics® NanoKnife™, VenaCure™ laser, StarBurst, Uniblade, Habib® Resector, Bovie™ Electrodes, Covidien Evident™, Cool-Tip™ Ablation Antennas, Opti4™ Electrodes, Microsulis MEA (microwave endometrial ablation), Acculis Halt™ Medical System, Optimed BigLumen Aspiration Catheter, Optimed Optipure Stent, and central venous catheterization introducer medical devices (such as those made by Bard and Arrow).

Once tracked, a healthcare provider is able to see image guidance data on display 220 that will allow her to know the relative pose, location, or emplacement of the tracked instrument(s) with respect to one another or with respect to imaging data and will be able to see, on display 220, the features of the instrument rendered in the scene.

Depicting Medical Device Placement, Trajectory, and Other Image Guidance Cues

In certain procedures, the system can provide image prediction information related to the surgical instruments as image guidance cues. In the context of scalpel movement, this can be the location that the scalpel will hit if a healthcare provider continues to move the scalpel in a particular direction. In the context of ablation or biopsies, this can be the projected medical device placement if it is driven along its central axis, which is also referred to herein as a longitudinal axis.

FIG. 2 further illustrates an embodiment of a projected needle drive 208 (also referred to as a trajectory indicator) as an image guidance cue. If a healthcare provider is driving an ablation needle 242 into tissue (not pictured), then she may want to know where the medical device will be driven. In some embodiments, the projected drive 208 of a medical device can be depicted on the display 220 and can show the healthcare provider the projected path 208 that the medical device 242 will take if it is driven along its central axis. Although the trajectory of only one medical device is displayed, it will be understood that the trajectory of multiple medical devices can be determined and displayed simultaneously on screen 220, as described in greater detail in the '274 application.

In some embodiments, to implement the trajectory indicators 208, the image guidance system can draw a number of rings about the axis of the medical device shaft, extrapolated beyond its tip, as depicted in FIG. 2. A healthcare provider can view and manipulate the emplacement of the medical device 242 and its expected drive projection (via its displayed projected trajectory) before it enters the patient's tissue. In some embodiments, this is accomplished by the doctor positioning the virtual rings in the drive projection such that they are coincident with (or pass through) the ultrasound representation of a target, such as a tumor that the doctor has spotted in the ultrasound. This can allow the healthcare provider to verify that the medical device 242 is properly aimed at the target and can drive the medical device 242 forward into the tissue such that it reaches its desired target or destination. For example, if the doctor identifies a tumor 212 in the ultrasound image, she can align the ablation needle 242 such that the drive projection rings on display 220 intersect or otherwise indicate that the medical device, if driven straight, will reach the tumor 212.

The rings can, in some embodiments, be spaced at regular (e.g., 0.5, 1, or 2 cm) intervals to provide the healthcare provider with visual or guidance cues regarding the distance from the medical device tip to the targeted anatomy. In some embodiments, the spacing of the rings can indicate other aspects of the data, such as the drive speed of the medical device, the density of the tissue, the distance to a landmark, such as the ultrasound data, or any other appropriate guidance data or property. In some embodiments, the rings or other trajectory indicators can extend beyond the medical device tip by a distance equal to the length of the medical device shaft. This way, the user knows if the medical device is long enough to reach the target—even before the tip enters the patient. That is, in some embodiments, if the rings do not reach the target with the tip still outside the body, then the tip will not reach the target even when the entire length of the shaft is inserted into the body.
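As one hedged illustration of such ring placement, the sketch below computes candidate ring centers at a fixed spacing along the extrapolated shaft axis, extending one shaft length beyond the tip; the function and parameter names are hypothetical, not from the disclosure:

```python
import numpy as np

def ring_centers(tip: np.ndarray, axis_dir: np.ndarray,
                 shaft_length_cm: float, spacing_cm: float = 1.0) -> list:
    """Centers of trajectory rings extrapolated beyond the device tip.

    `axis_dir` is assumed to be a unit vector along the shaft. The rings
    extend past the tip by one shaft length, so if the farthest ring does
    not reach the target, the fully inserted device will not either.
    """
    count = int(shaft_length_cm // spacing_cm)
    return [tip + axis_dir * spacing_cm * (i + 1) for i in range(count)]
```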

Other display markers can be used to show trajectory, such as a dashed, dotted, or solid line, transparent medical device shaft, point cloud, wire frame, etc. In some embodiments, three-dimensional rings can be used to provide depth cues while obscuring little of the ultrasound image. Virtual rings or other virtual markers can be displayed semi-transparently, so that they obscure less of the ultrasound image than an opaque marker would.

Other prediction information can also be displayed as image guidance cues. For example, if a scalpel is being tracked by the image guidance system, then a cutting plane corresponding to the scalpel can be displayed (not pictured). Such a cutting plane can be coplanar with the blade of the scalpel and can project from the blade of the scalpel. For example, the projected cutting plane can show where the scalpel would cut if the doctor were to advance the scalpel. Similar prediction information can be estimable or determinable for cauterizers, lasers, and numerous other surgical instruments.

Furthermore, the data from two or more devices can be combined and displayed based on their relative emplacements or poses. For example, the system 100 can determine the emplacement of an image plane based on the emplacement information of the ultrasound probe 222. Further, the rendered ultrasound image 204 can be displayed on the image plane with respect to the virtual medical device 202 on the display 220 in a manner that estimates the relative emplacements or poses of an ultrasound probe 222 and the medical device 242. As illustrated in FIG. 2, the image guidance cues associated with the virtual medical device 202, including the affected region indicator 206 and trajectory indicators 208, are shown spatially located with the rendered ultrasound image 204 on display 220.

In addition, the display 220 can include another image guidance cue in the form of an intersection indicator 210 that indicates where the virtual ablation medical device 202 (and/or its trajectory) intersects the ultrasound image 204. In some embodiments, the intersection indicator 210 can be displayed before the medical device is inserted, thereby allowing the healthcare provider to see where the medical device will intersect the image, or imaged area.
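One plausible way to compute such an intersection indicator is a standard ray-plane test between the device's extrapolated axis and the image plane; the sketch below is illustrative only, with hypothetical names, and is not presented as the disclosed computation:

```python
import numpy as np

def trajectory_plane_intersection(tip, axis_dir, plane_point, plane_normal):
    """Point where the device's extrapolated axis crosses the image plane.

    All inputs are 3-vectors; `axis_dir` and `plane_normal` are assumed
    to be unit length. Returns None when the trajectory is parallel to
    the plane or the crossing lies behind the tip.
    """
    denom = np.dot(axis_dir, plane_normal)
    if abs(denom) < 1e-9:  # trajectory runs parallel to the image plane
        return None
    t = np.dot(plane_point - tip, plane_normal) / denom
    return tip + t * axis_dir if t >= 0 else None
```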

In the illustrated embodiment, a tumor 212 appears in the ultrasound image, or rendered ultrasound image 204, and the virtual ablation needle 202 is shown driven through the tumor 212. As will be described in greater detail below, the displayed affected region (or affected region indicator) 206 can indicate what region or volume would be affected when the medical device 242 is operated. In the illustrated embodiment, the displayed affected region 206 can estimate where ablation would occur if the tissue were ablated at that time. As can be seen, in the illustrated embodiment, the displayed affected region 206 appears to cover the tumor displayed in the ultrasound image.

Various embodiments can include any combinations of the graphics described above and/or other graphics or image guidance cues. For example, in some embodiments, data related to a single surgical instrument (such as an ablation needle, ultrasound probe, etc.) can be presented in more than one manner on a single display. Consider an embodiment in which device 242 is an ablation needle and device 222 is an ultrasound transducer. As mentioned previously, as the medical devices are displayed in a virtual 3D space, with the screen 220 acting as a window into the virtual 3D space, if a healthcare provider orients ultrasound transducer 222 such that it is perpendicular to the point-of-view or point-of-view location (e.g., perpendicular to the screen), the perspective view of the ultrasound image 204 would show only the edge and the contents of the ultrasound image 204 would not be visible. In some embodiments, the image guidance system can track the healthcare provider's head using an emplacement sensor and/or a position sensing unit. In some embodiments, such as when the head of a user is tracked, the healthcare provider can then move her head to the side, so that she sees the ultrasound image from a different point-of-view location.

In some embodiments, the image guidance system can constantly display an additional 2D view 205 of the ultrasound image, simultaneously with the 3D depiction 204, so that the ultrasound image is always visible, regardless of the emplacement in which the healthcare provider holds the transducer 222. The 2D view 205 of the ultrasound data can be similar to what a healthcare provider is accustomed to seeing with traditional ultrasound displays. This can be useful to provide the healthcare provider with imaging to which she is accustomed, and allows the healthcare provider to see the ultrasound data regardless of the then-current emplacement of the ultrasound probe with respect to the user.

In some embodiments, the 2D view 205 of an ultrasound image is depicted in the upper right corner of the monitor (though it can be placed in any location). In some embodiments, the guidance system can automatically (and continually) choose a corner in which to render the 2D view 205 of the ultrasound image, based on the 3D position of the surgical instruments in the rendered scene. For example, in FIG. 2, ablation needle 242 can be held in the healthcare provider's left hand and the medical device shaft is to the left of the 3D view of the ultrasound image slice, so that the 2D view 205 of the ultrasound image in the upper right corner of display 220 does not cover any of the 3D features of the medical device (or vice-versa). If the medical device were held in the healthcare provider's right hand, the virtual medical device shaft would appear on the right side. To prevent the 2D view 205 in the corner of display 220 from covering the medical device shaft, the system can automatically move it to a corner that would not otherwise be occupied by graphics or data.

In some embodiments, the system attempts to avoid having the 2D view 205 of the ultrasound image quickly moving among corners of the display in order to avoid overlapping with graphics and data in the display. For example, a function f can be used to determine which corner is most suitable for the 2D ultrasound image to be drawn in. The inputs to f can include the locations, in the screen coordinate system, of the displayed medical device tip, the corners of the 3D view of the ultrasound image, etc. In some embodiments, f's output for any given point in time is independent of f's output in the previous frames, which can cause the ultrasound image to move among corners of the display rapidly. In some embodiments, the image guidance system will filter f's output over time. For example, the output of a filter g, for any given frame, could be the corner that has been output by f the most times over the last n frames, possibly weighting the most recent values of f most heavily. The output of the filter g can be used to determine in which corner of display 220 to display the 2D ultrasound image, and the temporal filtering provided by g can allow the 2D view 205 of the ultrasound image display to move more smoothly among the corners of the display 220.
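A minimal sketch of such a temporal filter g, assuming f returns a corner label each frame (the class name, window size, and linear recency weighting are illustrative choices, not the patented method):

```python
from collections import Counter, deque

class CornerFilter:
    """Temporal filter g over the per-frame corner choice f.

    Keeps the last n outputs of f and reports the weighted mode,
    weighting recent frames more heavily so the 2D inset glides
    between corners instead of flickering.
    """
    def __init__(self, n: int = 60):
        self.history = deque(maxlen=n)

    def update(self, corner: str) -> str:
        self.history.append(corner)
        weights = Counter()
        for age, c in enumerate(self.history):  # oldest entries first
            weights[c] += age + 1               # most recent weighs most
        return weights.most_common(1)[0][0]
```

For example, calling `update("upper_right")` each frame returns "upper_right" only once it has dominated the recent history, smoothing out momentary flips of f.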

In some embodiments, other appropriate virtual information and/or image guidance cues can be overlaid on the 2D view 205 of the ultrasound image as well as the 3D view 204. Examples include: orientation indicator 214; an indication of the distance between the medical device's tip and the point in the plane of the ultrasound image that is closest to the medical device tip; the cross section or outline of the ablation volume that intersects with the ultrasound slice; and/or the intersection point, box, outline, etc. between the medical device's axis and the ultrasound image plane.

Furthermore, it will be understood that other image guidance cues can be generated and displayed on the display as described in greater detail in the '274 application, previously incorporated herein by reference. For example, the system 100 can generate and/or display graphical indicators that help indicate the spatial relationship between a medical device and an ultrasound image plane (e.g., graphical image plane indicators) or other plane (e.g., graphical plane indicators), indicators to indicate the relative positions of the medical device(s) and ultrasound image, features of interest, annotations, foundational plane indicators, foundational plane intersection indicators, other graphical indicators, approximate medical device location indicators, etc. As described in greater detail above and in the '274 Application, the various image guidance cues can be generated based at least in part on the emplacement information of the medical devices used with the system 100.

Depicting Affected Region and Other Information

Embodiments of the system can include image guidance cues as part of the image guidance data to depict information related to the region or regions that will be affected by the use of surgical instruments. For example, in some embodiments, an image guidance cue displayed by the image guidance system can include affected region information. The illustrated embodiment of FIG. 2 shows a virtual ablation needle 202, which has a darkened portion 203 that indicates where the radio frequency or microwave energy for ablation will be emitted, and a displayed affected region 206 showing the volume that will be, or is being, ablated.

In some embodiments, the system can use the operating parameters of the medical device 242 and/or measured parameters to determine the affected region (and display the displayed affected region 206). For example, in some cases, the affected region's approximate size (e.g., girth and length) can be either specified by the healthcare provider or automatically computed by the guidance system based on one or more operating parameters, such as, but not limited to, the medical device make and model, the power and duration settings of the medical device (e.g., microwave or radio frequency generator for ablation needles, etc.), and the like. Similarly, the system can use measured parameters to determine the affected region, such as, but not limited to, measured or estimated temperature, the impedance of surrounding tissue, etc. In some embodiments, the measured parameters can be received in real-time as real-time data. In either case, the system can use one or more formulas, look-up tables, fixed or default values, or any other appropriate available information to determine the affected region.
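
As a non-limiting sketch of how such a computation might be organized (the table contents, device names, and energy-scaling formula below are invented for illustration and are not features of any described embodiment):

    # Hypothetical look-up table keyed by (make, model): nominal affected-region
    # girth and length, in millimeters, at a reference power/duration setting.
    ABLATION_TABLE = {
        ("AcmeMedical", "MW-100"): {"girth_mm": 30.0, "length_mm": 45.0,
                                    "ref_power_w": 60.0, "ref_duration_s": 300.0},
    }

    def estimated_affected_region(make, model, power_w, duration_s,
                                  default=(25.0, 40.0)):
        """Estimate (girth, length) of the affected region from operating
        parameters, falling back to fixed default values when the medical
        device is not in the table."""
        entry = ABLATION_TABLE.get((make, model))
        if entry is None:
            return default
        # Assumed scaling: the volume grows with the delivered energy relative
        # to the reference setting (a stand-in for a device-specific formula).
        energy_ratio = ((power_w * duration_s) /
                        (entry["ref_power_w"] * entry["ref_duration_s"]))
        scale = energy_ratio ** (1.0 / 3.0)  # volume ratio -> linear dimension
        return entry["girth_mm"] * scale, entry["length_mm"] * scale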

In addition, the system can determine affected regions prior to operating the medical device and/or during operation of the medical device. For example, prior to operating the medical device, the system can determine one or more predicted affected regions, and during operation of the medical device, the system can determine one or more dynamic affected regions. In some embodiments, the predicted affected regions can be static during operation of the medical device and the dynamic affected regions can change over time. In certain embodiments, the system may rely more on operating parameters of the medical device to determine the predicted affected regions and on measured parameters to determine the dynamic affected regions. However, it will be understood that operating parameters and/or measured parameters can be used to determine the predicted affected regions and/or the dynamic affected regions.

In some circumstances, the operating parameters, measured parameters, formulas, look-up tables, fixed or default values, or other information used to determine the affected regions may include some amount of error or variance. The variance may be due to uncertainty regarding the tissue that will be presented for ablation, a manufacturer's indication that impedance, temperature, power, etc., can vary between tissues and/or medical devices, etc.

As such, it can be difficult to determine the affected region with certainty. Thus, the operating parameters or other data can include one or more variance parameters indicating the amount of variance that a healthcare provider can expect when using a particular medical device. The variance parameter may account for all possible outcomes or a significant portion of possible outcomes (non-limiting examples: 95% or 99%). For example, the variance parameter can indicate that a medical device operates within a certain range, or that a healthcare provider can expect a certain volume to be affected with a particular standard deviation and/or +/− some percent. For instance, the variance parameter may indicate that, when operated for a particular amount of time, an ablation needle will ablate a certain range of tissue, or that a certain amount of tissue will be ablated with a particular standard deviation and/or +/− some percent.

Accordingly, in such scenarios, the system can determine multiple affected regions based at least in part on the variance parameter. For example, the system can determine two affected regions using the extrema of the variance parameter. In some cases, such as when the variance parameter includes a lower threshold and a higher threshold, the system can determine two affected regions using the lower threshold and the higher threshold, respectively. The affected regions can be predicted affected regions and/or dynamic affected regions depending on when and how the system determines them. In some cases, a third affected region can be determined. The third affected region can be determined using a third point in the range, such as the midpoint, average, or other point.
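
As a non-limiting illustration, and assuming purely for the sake of the sketch that the variance parameter is expressed as a +/- fraction of the nominal region size, the multiple affected regions could be derived as follows:

    def affected_region_extents(nominal_girth_mm, nominal_length_mm, variance):
        """Given a nominal affected-region size and a variance parameter
        expressed as a +/- fraction (e.g., 0.15 for +/-15%), return region
        sizes for the lower threshold, the upper threshold, and the midpoint."""
        lo, hi = 1.0 - variance, 1.0 + variance
        first = (nominal_girth_mm * lo, nominal_length_mm * lo)   # smallest expected
        second = (nominal_girth_mm * hi, nominal_length_mm * hi)  # largest expected
        third = (nominal_girth_mm, nominal_length_mm)             # midpoint estimate
        return first, second, third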

The system can also determine the emplacement of the affected region. In some cases, the emplacement of the affected region can be based at least in part on the emplacement of some or all of the corresponding medical device (or virtual medical device), such as medical device 242 in FIG. 2. For example, the system can receive emplacement data from one or more emplacement sensors associated with the medical device 242 (non-limiting examples: coupled to or integrated with the medical device 242, within an optical path of the medical device, etc.). The system can use the emplacement data to determine the emplacement of the tracked medical device and/or the emplacement of the virtual medical device 202 corresponding to the medical device 242. In some instances, the system can determine the emplacement of the medical device 242 and/or virtual medical device 202 with respect to a point-of-view location.

As yet another example, if the medical device is an ablation needle and the affected region is an ablation volume, the emplacement of the ablation volume can be based at least in part on the emplacement of the ablation needle (or its rendered version) or at least a portion of it, such as the location on the ablation needle where the ablation energy will be emitted. Specifically, in some embodiments, the affected region can be centered at a location on the medical device, such as the location on the medical device that affects the surrounding tissue (non-limiting examples: microwave emitter, laser source output, etc.). Similarly, if multiple medical devices are used, the emplacement of the affected region can be based at least in part on the emplacements of the medical devices.
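
As a non-limiting sketch (the coordinate conventions and the 10 mm emitter offset are assumptions invented for illustration), centering the affected region at the energy-emitting portion of the device can amount to offsetting the tracked tip position along the shaft direction:

    import numpy as np

    def affected_region_center(tip_pos, shaft_dir, emitter_offset_mm):
        """Place the affected region at the energy-emitting portion of the
        device: start from the tracked tip position and step back along the
        shaft direction (pointing from base to tip) by the emitter's offset."""
        shaft_dir = shaft_dir / np.linalg.norm(shaft_dir)
        return tip_pos - shaft_dir * emitter_offset_mm

    # Example: an emitter centered 10 mm proximal of the tip.
    center = affected_region_center(np.array([120.0, 85.0, 40.0]),
                                    np.array([0.0, 0.0, 1.0]), 10.0)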

The system can display the affected region in a variety of ways. Furthermore, although the illustrated embodiment of FIG. 2 includes only one displayed affected region 206, it will be understood that one or more affected regions can be displayed corresponding to each medical device 242 that is displayed on the screen 220 and/or multiple affected regions can be displayed corresponding to a single medical device 242. In some embodiments, the system can display a perspective view of the affected region and/or a non-perspective view, such as by displaying the affected region on or with the ultrasound image displayed in the 2D view. Some or all of the affected regions can be displayed as desired. In some embodiments, the portion of the affected region that is displayed can be referred to as the displayed affected region and/or the surface display region.

For some medical devices, the expected volume of ablated tissue is neither spherical nor centered at the tip of the medical device. Accordingly, in such embodiments, the affected regions can match expected volumes. For example, a Covidien surgical microwave medical device has an ellipsoidal ablation volume; a Covidien Evident transcutaneous microwave medical device has a teardrop-like ablation volume; RFA Medical's bipolar ablation system uses two medical devices simultaneously, where each medical device has paddles that deploy after the medical device is inserted inside the tissue (which one can equate to a canoe's oar). In some embodiments, the affected region for such a medical device corresponds to a volume that lies directly between the paddles of the two medical devices.

Although the illustrated embodiment of FIG. 2 refers to the affected region as an ablation volume, it will be understood that the affected region can correspond to a variety of medical procedures. For example, if a cauterizer is tracked as part of an image guidance system, then the affected region can correspond to a cauterization volume. If a laser is tracked as part of the image guidance system, then the affected region can correspond to the projected laser path. Similarly, the affected region can correspond to a biopsy volume, an electroporated volume, a cryoablation volume, a laser ablation volume, a high-intensity focused ultrasound (HIFU) ablation volume, an external beam radiation therapy volume, or a drilling volume (where the display volume corresponds to the region of bone and other tissue that the manually operated, or computer-controlled, drill would remove), depending on the type of medical instrument being used.

Example Displayed Affected Regions

As non-limiting examples and with reference to FIGS. 3A-3J, 4A-4C, and 5A-5C, the system can display the affected regions as a transparent volume, a wireframe volume (as depicted in FIG. 2), a volume with varying opacity, a point cloud of various densities, a surface, an outline, or any portion thereof. As mentioned above, the affected regions can correspond to predicted affected regions and/or dynamic affected regions, as desired.

When displaying a portion of a volume, the system can display the portions of the volume that are located in front of the image slice with respect to the point-of-view location (or display them differently than portions that are behind the image slice), alternating bands or tiles of the affected region, and/or portions of the affected region that are co-located with or intersect the medical device and/or the image slice. In addition, when multiple affected regions are determined, the system can display them in any combination as described above. Further, in some embodiments, such as when the system determines a second affected region that includes a first affected region, the displayed affected region can include the portions of the second affected region that are unique to the second affected region with respect to the first affected region, either alone or in combination with other portions. In certain embodiments, such as when the system determines that portions of a second affected region and a first affected region overlap, the system can display the overlapping portions alone, or in combination with other portions.
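
As a non-limiting sketch, and assuming purely for illustration that each affected region has been rasterized into a boolean voxel grid, the unique and overlapping subsets reduce to element-wise set operations:

    import numpy as np

    def unique_portion(second_mask, first_mask):
        """Voxels of the second affected region that are unique to it with
        respect to the first (a set difference on boolean occupancy grids)."""
        return second_mask & ~first_mask

    def overlapping_portion(second_mask, first_mask):
        """Voxels where the two affected regions overlap."""
        return second_mask & first_mask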

Furthermore, the system can display the affected regions and other displayed features differently. For example, in some embodiments, the system can vary the characteristics of the affected regions (non-limiting examples: portions closer to the outline or edge of the affected regions can be more/less opaque, bright, or focused; distal portions of the affected regions can be more/less opaque, bright, or focused). Similarly, the system can vary the characteristics of the other displayed features. In some embodiments, the system can use different display settings for different portions of an image slice. For example, the system can display portions of the image slice within a first affected region using a first setting, portions of the image slice within a second affected region using a second setting, and portions of the image slice outside the first and second affected regions using a third setting. The different settings can correspond to different opacity levels, brightness levels, contrast levels, and/or focus levels, etc. In some embodiments, portions of the image slice outside the first and second affected regions can be darkened, blurred, or otherwise adjusted to provide the healthcare provider with additional insight regarding the portions of the image slice that are within the affected region(s).
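
A non-limiting sketch of such per-region display settings, assuming for illustration that the image slice is a grayscale pixel array and that boolean masks for the two affected regions have already been computed (the gain values are invented):

    import numpy as np

    def shade_image_slice(slice_pixels, first_mask, second_mask,
                          inner_gain=1.0, outer_gain=0.8, outside_gain=0.4):
        """Apply a first setting inside the first affected region, a second
        setting inside the second affected region, and a third (darkened)
        setting outside both regions."""
        out = slice_pixels.astype(np.float32) * outside_gain
        out[second_mask] = slice_pixels[second_mask] * outer_gain
        out[first_mask] = slice_pixels[first_mask] * inner_gain
        return np.clip(out, 0, 255).astype(slice_pixels.dtype)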

FIGS. 3A-3J are diagrams illustrating various non-limiting embodiments for displaying a perspective view of at least a portion of the affected regions as a volume. FIG. 3A illustrates an embodiment in which two affected regions are displayed as two transparent volumes 302, 304 and/or outlines along with a virtual medical device 301. FIG. 3B illustrates an embodiment in which two affected regions are displayed as two volumes 306, 308 with varying opacity (or as semi-transparent). In the illustrated embodiment, the volumes 306, 308 are more opaque near the edges; however, it will be understood that the opacity can be varied throughout the volumes 306, 308 as desired.

FIG. 3C illustrates an embodiment in which two affected regions are displayed as two volumes 310, 312 with a surface texture. FIG. 3D illustrates an embodiment in which a portion of the affected region that is located between the image slice 314 and the point-of-view location (or in front of the image slice 314) is displayed as a volume 316. It will be understood that the portion of the affected region that is located between the image slice 314 and the point-of-view location can be displayed differently from portions of the affected region that are located distally from the point-of-view location with respect to the image slice 314. For example, the different portions can be displayed with different colors, brightness, sharpness, etc. In some cases, the portions of the affected region located distally from the point-of-view location with respect to the image slice 314 can be displayed with lighter or more faded colors, etc.
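
As a non-limiting sketch of one way to separate the two groups of points (the plane representation and array conventions here are assumptions of the sketch):

    import numpy as np

    def split_by_image_plane(points, plane_point, plane_normal, view_pos):
        """Split affected-region points into those between the image slice and
        the point-of-view location ("in front") and those beyond the slice, so
        the two groups can be rendered with different colors or brightness."""
        n = plane_normal / np.linalg.norm(plane_normal)
        if np.dot(view_pos - plane_point, n) < 0:
            n = -n  # orient the normal toward the point-of-view location
        signed = (points - plane_point) @ n
        return points[signed > 0], points[signed <= 0]  # in front, behind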

FIG. 3E illustrates an embodiment in which portions of a second affected region are displayed. The portions displayed correspond to the portions of the second affected region that are unique to the second affected region with respect to the first affected region. In the illustrated embodiment, the portions are displayed as spikes 318. However, it will be understood that any shape or design can be used as desired. In some embodiments, the spikes 318 can be a predefined length and/or have markings at predefined lengths, such as 1 cm. As such, a user can use the spikes 318 to determine the distances of objects displayed on the screen. Although not illustrated in FIG. 3E, it will be understood that the opacity of the spikes 318 can vary as desired. In some embodiments, such as when the second affected region is a predicted affected region, as a dynamic affected region grows, the system can cause the spikes 318 to become transparent, or otherwise adjust a display setting, based at least in part on the location of the dynamic affected region with respect to the spikes 318. FIG. 3F illustrates an embodiment in which the embodiments from FIGS. 3B and 3E are combined. As mentioned previously, any combination of the described embodiments can be used as desired.

FIG. 3G illustrates embodiments in which portions of three affected regions are displayed as outlines 320a, 320b, 322a, 322b, 324a, 324b. In some embodiments, outlines 320a, 320b correspond to vertical and horizontal edges, respectively, of a first affected region, outlines 322a, 322b correspond to vertical and horizontal edges, respectively, of a second affected region, and outlines 324a, 324b correspond to vertical and horizontal edges, respectively, of a third affected region. FIG. 3G also illustrates embodiments in which the one or more outlines 320a, 320b, 322a, 322b, 324a, 324b are associated with a particular affected region, such as portions of a second affected region that are unique to it with respect to a first affected region.

FIG. 3H illustrates an embodiment in which the two affected regions are displayed as two volumes 326, 328, in which the portions of the second affected region that are unique to the second affected region with respect to the first affected region are displayed differently (non-limiting examples: varying opacity, color, brightness, focus, etc.).

FIG. 3I illustrates embodiments in which portions of an affected region are displayed using horizontal bands 330 with alternating display settings (e.g., transparency, brightness, etc.). It will be understood that bands with any orientation can be used. Bands with higher transparency levels can enable a healthcare provider to see into the affected region. In some embodiments, the width and/or the arc length of each band can be equal, and a top portion 332 and a bottom portion 334 can have a variable corresponding width/arc length. The variable width/arc length can be determined such that the bands 330 have an equal width/arc length. As such, a user can use the bands to estimate the distances of objects displayed on the screen.
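
As a non-limiting sketch of the arithmetic (assuming, purely for illustration, a spherical affected region whose upper semicircular arc is divided into equal-arc-length bands, with the remainder split between the variable top and bottom portions):

    import math

    def band_boundaries(radius_mm, band_arc_mm):
        """Divide a semicircular arc of radius radius_mm into alternating bands
        of equal arc length band_arc_mm; whatever arc remains is split evenly
        between a variable top portion and a variable bottom portion."""
        half_arc = math.pi * radius_mm          # arc length of the semicircle
        n_bands = int(half_arc // band_arc_mm)  # whole bands that fit
        cap = (half_arc - n_bands * band_arc_mm) / 2.0  # top/bottom portions
        # Boundaries as polar angles (radians) measured from one end of the
        # arc; band i spans angles[i-1]..angles[i].
        return [(cap + i * band_arc_mm) / radius_mm for i in range(n_bands + 1)]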

FIG. 3J illustrates an embodiment in which portions of an affected region are displayed using tiles 336 having alternating display settings (e.g., transparency, brightness, etc.). In some embodiments, each tile 336 can be displayed with a different display level. In certain embodiments, every other tile can be displayed with the same display level. Tiles with higher transparency levels can enable a healthcare provider to see into the affected region. In some embodiments, horizontal and/or vertical lengths and/or the arc length of each tile can be equal, and a top portion and a bottom portion can have a variable corresponding length. The variable length can be determined such that the tiles 336 have an equal length.

Surface Display Regions

FIGS. 4A-4C and 5A-5C are diagrams showing various embodiments for displaying at least a portion of the affected regions that intersect with, or are co-located or level with, at least a portion of a medical display object (non-limiting examples: the image slice 406 (FIGS. 4A-4C) or the virtual medical device 501 (FIGS. 5A and 5B)), which may also be referred to herein as surface display regions.

The surface display regions can correspond to predicted affected regions and/or dynamic affected regions, as desired. Accordingly, if the first and/or second affected regions are dynamic affected regions, the associated surface display regions can move, or grow, during operation of the medical device associated with the virtual medical device 401, 501.

In addition, the surface display regions can correspond to affected regions that are co-located with the medical display object or only portions thereof, or the medical display object's trajectory. Accordingly, in some embodiments, the surface display region can be displayed as a volume, area, or line depending on which portions of the affected region and medical display object are used to determine the surface display region.

In some embodiments, to determine whether a portion of the affected region and a portion of the medical display object are co-located or level, the system 100 can compare the coordinates of the portion of the affected region with the coordinates of the portion of the medical display object. If the coordinates (e.g., the x, y, z coordinates) match (e.g., are equal) or satisfy a distance threshold, the system can determine that the portion of the medical display object and the portion of the affected region are co-located. In certain embodiments, the system 100 can determine that the portion of the affected region and the portion of the medical display object are co-located if the portion of the affected region and the portion of the medical display object can be mapped to the same pixel in a video or image output data buffer.

The distance threshold can be a predefined distance, such as one or more bits, one or more pixels, etc. In some embodiments, the distance threshold can be based at least in part on whether the distance between the coordinates is perceptible to a user, which may be based at least in part on the size of the display, the size of the display relative to the image and/or imaged area, and/or the distance between the point-of-view location and the display, etc. For example, in some cases, the distance threshold can be smaller for larger displays (or larger display:image ratios) and larger for smaller displays (or smaller display:image ratios), or vice versa. In certain cases, the distance threshold can be larger for larger distances between the point-of-view location and the display and smaller for smaller distances between the point-of-view location and the display, or vice versa. In certain embodiments, the distance threshold can be different for each coordinate.
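
A non-limiting sketch of such a comparison follows; the per-axis threshold values are placeholders and, as noted above, could instead be derived from the display size or the distance between the point-of-view location and the display:

    def co_located(region_pt, object_pt, thresholds=(1.0, 1.0, 1.0)):
        """Return True if a point of the affected region and a point of the
        medical display object match to within a per-coordinate distance
        threshold (the threshold can differ for each coordinate)."""
        return all(abs(a - b) <= t
                   for a, b, t in zip(region_pt, object_pt, thresholds))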

In certain embodiments, the system 100 can perform the comparison for each location of the medical display objects and/or each location of the affected regions. In some cases, the system can determine that the portion of the medical display object and the portion of the affected region are co-located if the portion of the medical display object and the portion of the affected region are level and have the same depth.

Any coordinate system can be used to compare the coordinates of the portion of the affected region with the medical display object and/or to determine whether the portion of the affected region is co-located with the medical display object. For example, the coordinate system of the display and/or the coordinate system of a device in the system 100 that is used to determine the emplacement of the medical devices can be used, as desired.

In some embodiments, the coordinate system of the display is used. The coordinate system of the display can be defined in any manner, as desired. In certain embodiments, the coordinate system of the display is defined such that the x-axis corresponds to the width of the display, the y-axis corresponds to the height of the display, and the z-axis corresponds to the depth (e.g., into and out of) of the display. In such embodiments, the system 100 can determine that the portion of the affected region satisfies the location threshold and/or is level with the medical display object based at least in part on the x and y coordinates of the affected region and the x and y coordinates of the medical display object. For example, if the x and y coordinates of the affected region and the x and y coordinates of the medical display object match (or satisfy a distance threshold), the system 100 can determine that the portion of the affected region satisfies the location threshold.

Although reference is made to the x and y coordinates, it will be understood that the coordinates used to determine whether the portion of the affected region satisfies the location threshold and/or is co-located with the medical display object can be based at least in part on the coordinate system used. For example, in some embodiments, the coordinate system used can include the x-axis as the depth (e.g., forward/backward), the y-axis as lateral movement (e.g., side-to-side), and the z-axis as elevation (e.g., up/down). In such embodiments, the system 100 can determine that the portion of the affected region satisfies the location threshold if the y and z coordinates of the affected region match (or satisfy a distance threshold with respect to) the y and z coordinates of the medical display object.

In some embodiments, for each location on the display, the system can query whether a portion of the medical display object and/or a portion of the affected region have been (or will be) mapped to that location. If the system 100 determines that a portion of the medical display object and a portion of the affected region have been (or will be) mapped to that location, the system 100 can determine that the portion of the medical display object and the portion of the affected region are co-located.

In certain embodiments, the system 100 can determine that the portion of the affected region satisfies the location threshold, intersects, and/or is co-located with the medical display object if the portion of the affected region and the medical display object (or portion of the image corresponding to the medical display object) are co-located when mapped to a 2D plane. In some embodiments, the 2D plane can be based at least in part on the point-of-view location. For example, the 2D plane can be orthogonal to the point-of-view location. In certain embodiments, the system 100 can determine that the portion of the affected region satisfies the location threshold (or corresponding virtual affected region) if the portion of the affected region overlaps with the medical display object (or portion of the image corresponding to the medical display object) in a virtual image (e.g., one is directly in front of or behind the other in the virtual image). In certain embodiments, the system 100 can determine that the portion of the affected region satisfies the location threshold if the portion of the affected region and the medical display object (or portion of the image corresponding to the medical display object) map to the same location on a display, such as the same pixel or same array of pixels.
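
As a non-limiting sketch of the screen-space variant (the projection helper below is a stand-in for whatever rendering pipeline is in use, and its conventions are assumptions of the sketch):

    import numpy as np

    def make_projector(view_proj_4x4, width, height):
        """Build a simple perspective projector that maps a 3D point to
        integer pixel coordinates on a width x height display."""
        def project(p):
            v = view_proj_4x4 @ np.array([p[0], p[1], p[2], 1.0])
            x_ndc, y_ndc = v[0] / v[3], v[1] / v[3]
            return (int((x_ndc * 0.5 + 0.5) * width),
                    int((y_ndc * 0.5 + 0.5) * height))
        return project

    def same_pixel(region_pt, object_pt, project):
        """Screen-space co-location test: two 3D points are treated as
        co-located if they map to the same pixel (one is directly in front
        of or behind the other in the virtual image)."""
        return project(region_pt) == project(object_pt)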

With continued reference to FIGS. 4A-4C, various embodiments of surface display regions are shown. It will be understood that the embodiments illustrated in FIGS. 4A-4C are non-limiting in nature. Furthermore, any portion of any of the embodiments described above with reference to FIGS. 3A-3J can be displayed in conjunction with any portion of the embodiments described below with reference to FIGS. 4A-4C.

FIG. 4A illustrates various embodiments including embodiments showing two surface display regions as lines 402, 404. The line 402 can correspond to an outline of a first affected region that is co-located with at least a portion of the medical display object (image slice 406) and the line 404 can correspond to an outline of a second affected region that is co-located with at least a portion of the image slice 406. In some embodiments, the area between the lines 402, 404 can be displayed differently (non-limiting examples: varying opacity, color, brightness, focus, etc.). FIG. 4A can also illustrate embodiments showing a surface display region displayed as area 405. The area 405 can correspond to portions of a second affected region that are unique to it with respect to a first affected region and that are co-located with at least a portion of the image slice 406. In certain embodiments, the portions of the area 405 can be displayed differently (non-limiting examples: varying opacity, color, brightness, focus, etc.). FIG. 5A is similar to FIG. 4A except that the medical display object is the virtual medical device 501. Thus, lines 502, 504 correspond to lines 402, 404, respectively, and area 505 corresponds to area 405. In some embodiments, lines 502, 504, and/or area 505 can correspond to portions of a second affected region that are co-located with at least a portion of the virtual medical device 501 and/or its trajectory.

FIG. 4B illustrates various embodiments showing multiple solid lines 408, 410 and dashed lines 412, 414, which can form part of one or more surface display regions. For example, lines 408, 410 can correspond to different portions of a first affected region (and form part of a first surface display region) and lines 412, 414 can correspond to different portions of a second affected region (and form part of a second surface display region). In some embodiments, line 408 forms part of a first surface display region and lines 410, 412, 414 form part of a second surface display region, such as portions of the second affected region that are unique to it with respect to the first affected region. In certain embodiments, lines 408, 410, 412, 414 form part of four surface display regions. FIG. 5B is similar to FIG. 4B except that the medical display object is the virtual medical device 501. Thus, lines 508, 510, 512, 514 correspond to lines 408, 410, 412, 414, respectively.

FIG. 4C illustrates an embodiment in which an additional surface display region is displayed with the surface display regions illustrated in FIG. 4B. In an embodiment, line 408 forms part of a first surface display region that corresponds to a first predicted affected region and lines 410, 412, 414 form part of a second surface display region that corresponds to a second predicted affected region. Indicators 416 form part of a third surface display region, which corresponds to a dynamic affected region. As such, during operation of the medical device that corresponds to the virtual medical device 401, lines 408, 410, 412, and 414 can remain static, while indicators 416 move outward. In this manner, a healthcare provider can determine at what point to terminate operation of the medical device. Although a corresponding FIG. 5 is not provided, it will be understood that the medical display object can correspond to any object displayed by a display.

It will be understood that any of the aforementioned embodiments from FIGS. 3A-3J, 4A-4C, 5A, and 5B can be combined as desired. For example, FIG. 5C is a diagram illustrating an embodiment in which the embodiments described above with reference to FIGS. 3D, 3G, 4C, and 5B are combined.

FIG. 6 is a flow diagram illustrative of an embodiment of a routine 600 implemented by the system 100 to display at least a portion of an affected region, or displayed affected region. One skilled in the relevant art will appreciate that the elements outlined for routine 600 can be implemented by one or more computing devices/components that are associated with the system 100, such as the position sensing unit 140, the image guidance unit 130, surgical system 149, and/or imaging unit 150. Accordingly, routine 600 has been logically associated as being generally performed by the system 100. However, the following illustrative embodiment should not be construed as limiting. Furthermore, it will be understood that the various blocks described herein with reference to FIG. 6 can be implemented in a variety of orders. For example, the system may implement some blocks concurrently or change the order, as desired.

At block 602, the system 100 receives operating parameters of a medical device. As described in greater detail above, the operating parameters can include information regarding make and model, power and duration settings of the medical device, and/or variance parameters, etc. The operating parameters can be stored in a non-transitory, computer-readable medium associated with the system 100 and/or can be stored in the medical device.

At block 604, the system 100 determines a first affected region. As described previously, affected regions can correspond to predicted affected regions and/or dynamic affected regions. In some embodiments, the system determines the first affected region based at least in part on the operating parameters and/or measured parameters. In certain embodiments, the system 100 determines the first affected region based at least in part on a variance parameter of the medical device.

At block 606, the system 100 causes one or more displays to display at least a portion of the first affected region, or first displayed affected region. As described previously, the first displayed affected region can be displayed in a 2D view or 3D view and/or as a perspective view.

In addition, as described in greater detail above, the displayed affected region can be displayed as a volume, area, and/or line. The displayed affected region can be wire-framed, transparent, semi-transparent, have varied opacity, brightness, and/or focus, include alternating bands/tiles, be textured, include solid or dashed lines, include spikes, etc. In certain embodiments, the at least a portion of the first affected region corresponds to portions of the first affected region that are unique to it, with respect to other affected regions. In some embodiments, the displayed affected region corresponds to at least a portion of the first affected region that is co-located with at least a portion of a medical display object (non-limiting examples: a virtual medical device, image slice, etc.) or its trajectory, also referred to as the surface display region.

It will be understood that fewer, more, or different blocks can be used as part of the routine 600. For example, any combination of blocks 608, 610, 612, and 614 can be included as part of routine 600.

At block 608, the system 100 determines a second affected region. The second affected region can be determined in a manner similar to the first affected region. As described in greater detail above, in some embodiments, the variance parameter can be used to determine the first and second affected regions. For example, a first variance threshold can be used to determine the first affected region and a second variance threshold can be used to determine the second affected region. In certain cases, the first variance threshold can be less than the second variance threshold. In such embodiments, the second affected region can be larger than, and in some cases include, the first affected region.

At block 610, the system 100 causes one or more displays to display at least a portion of the second affected region, or second displayed affected region. The system 100 can cause the one or more displays to display the second displayed affected region similar to the first displayed affected region. In some embodiments, the second displayed affected region can be displayed differently, such as by using a different color, transparency level, focus setting, shape, texture, etc. Furthermore, in some embodiments, the system can omit causing the display of the first displayed affected region in favor of the second displayed affected region. In such embodiments, the second displayed affected region can, in some instances, correspond to the portions of the second affected region that are unique to the second affected region, with respect to the first affected region.

In some instances, the first affected region can be a predicted affected region and the second affected region can be a dynamic affected region. As such, in certain embodiments, during operation of the medical device, the second displayed affected region can change and/or grow with respect to the first displayed affected region.

At block 612, the system 100 determines a third affected region. For example, in some embodiments, the first and second affected regions can be first and second predicted affected regions and the third affected region can be a dynamic affected region. However, in certain embodiments the three affected regions can be dynamic affected regions, predicted affected regions, or any combination thereof.

At block 614, the system 100 causes one or more displays to display at least a portion of the third affected region, or third displayed affected region. The system 100 can cause the one or more displays to display the third displayed affected region similar to the first and second displayed affected regions. In some embodiments, the third displayed affected region can be displayed differently, such as by using a different color, transparency level, focus setting, shape, texture, etc. In embodiments where the third affected region is a dynamic affected region and the first and second affected regions are predicted affected regions, the third displayed affected region can move with respect to the first and second displayed affected regions.

With continued reference to FIG. 6, it will be understood that fewer or more blocks can be included. For example, as described in greater detail above, the system 100 can receive emplacement data from one or more sensors corresponding to one or more medical devices, determine an emplacement of the medical devices and/or corresponding virtual medical devices (as a non-limiting example, the emplacement can be determined based at least in part on a point-of-view location), cause one or more displays to display the virtual medical devices and/or perspective views thereof, determine the emplacement of the affected regions with respect to the virtual medical devices, determine the emplacement of and display an image slice, alter the display of the image slice, display the medical display objects in a 2D view, a 3D view, and/or a perspective view, etc.

FIG. 7 is a flow diagram illustrative of an embodiment of a routine 700 implemented by the system 100 to display at least a portion of multiple affected regions, or multiple displayed affected regions. One skilled in the relevant art will appreciate that the elements outlined for routine 700 can be implemented by one or more computing devices/components that are associated with the system 100, such as the position sensing unit 140, the image guidance unit 130, surgical system 149, and/or imaging unit 150. Accordingly, routine 700 has been logically associated as being generally performed by the system 100. However, the following illustrative embodiment should not be construed as limiting. Furthermore, it will be understood that the various blocks described herein with reference to FIG. 7 can be implemented in a variety of orders. For example, the system may implement some blocks concurrently or change the order, as desired.

At block 702, the system 100 determines the emplacement of a virtual medical device. In some embodiments, as described above, the system 100 can determine the emplacement of the virtual medical device based at least in part on emplacement data corresponding to a medical device, such as medical device 242. As described previously, the emplacement data can be received from one or more emplacement sensors associated with the medical device.

At block 704, the system 100 obtains operating parameters of the medical device, as described in greater detail above with reference to block 602 of FIG. 6. At block 706, the system 100 determines a first affected region. In some embodiments, the system 100 can determine the first affected region similar to the determination of the first affected region referenced above with respect to block 604 of FIG. 6. At block 708, the system 100 determines a second affected region. In some embodiments, the system 100 can determine the second affected region similar to the determination of the second affected region referenced above with respect to block 608 of FIG. 6. As described in greater detail above, the first and second affected regions can be predicted affected regions and/or dynamic affected regions, as desired.

At block 710, the system 100 can determine the emplacement of the first and second affected regions. At block 712, the system 100 can cause one or more displays to display at least a portion of the virtual medical device, at least a portion of the first affected region (first displayed affected region), and at least a portion of the second affected region (second displayed affected region). As described previously, the displayed affected regions can, in some embodiments, correspond to surface display regions. In some embodiments, the system 100 can cause the one or more displays to display at least a portion of the first and second affected regions similar to blocks 606 and 610 of FIG. 6, described previously. As described above, the first and second displayed affected regions can be displayed in a variety of ways.

It will be understood that fewer, more, or different blocks can be used as part of the routine 700. For example, any combination of blocks 714, 716, 718, and 720 can be included as part of routine 700.

At block 714, the system 100 can determine a third affected region, and at block 716, the system 100 can cause the one or more displays to display at least a portion of the third affected region. In some embodiments, the system 100 can determine the third affected region and display a third displayed affected region similar to the first and second displayed affected regions, as described in greater detail above with reference to blocks 612 and 614 of FIG. 6. In certain embodiments, the third displayed affected region can be a surface display region.

At block 718, the system 100 can determine an emplacement of an image slice, and at block 720, the system 100 can cause the one or more displays to display at least a portion of the image slice and/or perspective view thereof, as described in greater detail above.

With continued reference to FIG. 7, it will be understood that fewer or more blocks can be included. For example, as described in greater detail above, the system 100 can receive emplacement data from one or more sensors associated with one or more medical devices, display additional displayed affected regions, alter the display of the image slice, display image guidance cues, display the medical display objects in a 2D view, a 3D view, and/or a perspective view, etc.

Those having skill in the art will further appreciate that the various illustrative logical blocks, modules, circuits, and process steps described in connection with the implementations disclosed herein can be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans can implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention. One skilled in the art will recognize that a portion, or a part, can comprise something less than, or equal to, a whole. For example, a portion of a collection of pixels can refer to a sub-collection of those pixels.

The various illustrative logical blocks, modules, and circuits described in connection with the implementations disclosed herein can be implemented or performed with a processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A processor can be a microprocessor, but in the alternative, the processor can be any conventional processor, controller, or microcontroller. A processor can also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.

The steps of a method or process described in connection with the implementations disclosed herein can be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module can be stored in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of non-transitory computer-readable medium known in the art, as computer-executable instructions. An exemplary computer-readable storage medium is coupled to the processor such that the processor can read information and/or computer-executable instructions from, and write information to, the computer-readable storage medium. In the alternative, the storage medium can be integral to the processor. The processor and the storage medium can reside in an ASIC. The ASIC can reside in a user terminal, camera, or other device. In the alternative, the processor and the storage medium can reside as discrete components in a user terminal, camera, or other device.

Headings are included herein for reference and to aid in locating various sections. These headings are not intended to limit the scope of the concepts described with respect thereto. Such concepts can have applicability throughout the entire specification.

Conditional language such as, among others, “can,” “could,” “might,” or “may,” unless specifically stated otherwise, is otherwise understood within the context as used in general to convey that certain embodiments include, while other embodiments do not include, certain features, elements, and/or steps. Thus, such conditional language is not generally intended to imply that features, elements, and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without user input or prompting, whether these features, elements, and/or steps are included or are to be performed in any particular embodiment.

Language such as the phrase “at least one of X, Y and Z,” and “at least one of X, Y or Z,” unless specifically stated otherwise, is understood within the context as used in general to convey that an item, term, etc. may be either X, Y, or Z, or any combination thereof. Thus, such language is not generally intended to imply that certain embodiments require at least one of X, at least one of Y, and at least one of Z to each be present, or exclusively X, or exclusively Y, or exclusively Z.

Unless otherwise explicitly stated, articles such as ‘a’ or ‘an’ should generally be interpreted to include one or more described items. Accordingly, phrases such as “a device configured to” are intended to include one or more recited devices. Such one or more recited devices can also be collectively configured to carry out the stated recitations. For example, “a processor configured to carry out recitations A, B and C” can include a first processor configured to carry out recitation A working in conjunction with a second processor configured to carry out recitations B and C.

The previous description of the disclosed implementations is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these implementations will be readily apparent to those skilled in the art, and the generic principles defined herein can be applied to other implementations without departing from the spirit or scope of the invention. Furthermore, although described above with reference to medical devices and procedures, it will be understood that the embodiments described herein can be applied to other systems in which objects are tracked and virtual representations are displayed on a display and/or systems in which multiple objects are displayed on a display within a virtual space, such as within a virtual 3D space. Thus, the present invention is not intended to be limited to the implementations shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

State, Andrei, Heaney, Brian, Kohli, Luv, Razzaque, Sharif

Patent Priority Assignee Title
11103200, Jul 22 2015 InnerOptic Technology, Inc. Medical device approaches
11179136, Feb 17 2016 InnerOptic Technology, Inc. Loupe display
11259879, Aug 01 2017 INNEROPTIC TECHNOLOGY, INC Selective transparency to assist medical device navigation
11369439, Oct 27 2016 InnerOptic Technology, Inc. Medical device navigation using a virtual 3D space
11464575, Feb 17 2009 InnerOptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image guided surgery
11464578, Feb 17 2009 INNEROPTIC TECHNOLOGY, INC Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures
11481868, Aug 02 2006 InnerOptic Technology, Inc. System and method of providing real-time dynamic imagery of a medical procedure she using multiple modalities
11684429, Oct 02 2014 InnerOptic Technology, Inc. Affected region display associated with a medical device
11931117, Dec 12 2014 InnerOptic Technology, Inc. Surgical guidance intersection display
Patent Priority Assignee Title
10127629, Aug 02 2006 InnerOptic Technology, Inc. System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities
10136951, Feb 17 2009 InnerOptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image guided surgery
10188467, Dec 12 2014 INNEROPTIC TECHNOLOGY, INC Surgical guidance intersection display
10398513, Feb 17 2009 InnerOptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures
10433814, Feb 17 2016 InnerOptic Technology, Inc. Loupe display
3556079,
4058114, Sep 11 1974 Siemens Aktiengesellschaft Ultrasonic arrangement for puncturing internal body organs, vessels and the like
4249539, Feb 09 1979 Technicare Corporation Ultrasound needle tip localization system
4294544, Aug 03 1979 Topographic comparator
4390025, Apr 24 1980 Tokyo Shibaura Denki Kabushiki Kaisha Ultrasonic display apparatus having means for detecting the position of an ultrasonic probe
4407294, Jan 07 1982 Technicare Corporation Ultrasound tissue probe localization system
4431006, Jan 07 1982 Technicare Corporation Passive ultrasound needle probe locator
4567896, Jan 20 1984 ELSCINT, INC , A CORP OF MA Method and apparatus for calibrating a biopsy attachment for ultrasonic imaging apparatus
4583538, May 04 1984 Sherwood Services AG Method and apparatus for stereotaxic placement of probes in the body utilizing CT scanner localization
4620546, Jun 30 1984 Kabushiki Kaisha Toshiba Ultrasound hyperthermia apparatus
4671292, Apr 30 1985 Dymax Corporation Concentric biopsy probe
4839836, Mar 11 1986 U S PHILIPS CORPORATION, A CORP OF DE Signal transient improvement arrangement
4862873, May 27 1987 Olympus Optical Co., Ltd. Stereo endoscope
4884219, Jan 21 1987 Intel Corporation Method and apparatus for the perception of computer-generated imagery
4899756, Jul 18 1988 Articulated needle guide for ultrasound imaging and method of using same
4911173, Nov 13 1987 DIASONICS, INC Biopsy attachment for ultrasound probe
4945305, Oct 09 1986 BAE SYSTEMS PLC Device for quantitatively measuring the relative position and orientation of two bodies in the presence of metals utilizing direct current magnetic fields
5076279, Jul 17 1990 ACUSON CORPORATION, A CORP OF DE Needle guide for assembly upon an ultrasound imaging transducer
5078140, May 08 1986 Imaging device - aided robotic stereotaxis system
5078142, Nov 21 1989 Siemens AG Precision mammographic needle biopsy system
5095910, Apr 18 1990 Advanced Technology Laboratories, Inc.; ADVANCED TECHNOLOGY LABORATORIES, INC , A CORP OF WASHINGTON Ultrasonic imaging of biopsy needle
5109276, May 27 1988 The University of Connecticut Multi-dimensional multi-spectral imaging system
5158088, Nov 14 1990 Advanced Technology Laboratories, Inc. Ultrasonic diagnostic systems for imaging medical instruments within the body
5161536, Mar 22 1991 ECHO CATH, INC ; ECHO CATH, LTD Ultrasonic position indicating apparatus and methods
5193120, Feb 27 1991 Mechanical Technology Incorporated Machine vision three dimensional profiling system
5209235, Sep 13 1991 CARDIOVASCULAR IMAGING SYSTEMS, INC Ultrasonic imaging catheter assembly and method for identification of the same
5249581, Jul 15 1991 BANK OF MONTREAL Precision bone alignment
5251127, Feb 01 1988 XENON RESEARCH, INC Computer-aided surgery apparatus
5261404, Jul 08 1991 Three-dimensional mammal anatomy imaging system and method
5265610, Sep 03 1991 General Electric Company Multi-planar X-ray fluoroscopy system using radiofrequency fields
5271400, Apr 01 1992 General Electric Company Tracking system to monitor the position and orientation of a device using magnetic resonance detection of a sample contained within the device
5307153, Jun 19 1990 Fujitsu Limited Three-dimensional measuring apparatus
5309913, Nov 30 1992 The Cleveland Clinic Foundation; CLEVELAND CLINIC FOUNDATION, THE Frameless stereotaxy system
5323002, Mar 25 1992 Texas Instruments Incorporated Spatial light modulator based optical calibration system
5371543, Mar 03 1993 Texas Instruments Incorporated Monolithic color wheel
5383454, Oct 19 1990 St. Louis University System for indicating the position of a surgical probe within a head on an image of the head
5394875, Oct 21 1993 MARKER, LLC Automatic ultrasonic localization of targets implanted in a portion of the anatomy
5411026, Oct 08 1993 Best Medical International, Inc Method and apparatus for lesion position verification
5433198, Mar 11 1993 CATHEFFECTS, INC Apparatus and method for cardiac ablation
5433739, Nov 02 1993 Covidien AG; TYCO HEALTHCARE GROUP AG Method and apparatus for heating an intervertebral disc for relief of back pain
5443489, Jul 20 1993 Biosense, Inc. Apparatus and method for ablation
5446798, Jun 20 1989 Fujitsu Limited Method and apparatus for measuring position and orientation of an object based on a sequence of projected points
5447154, Jul 31 1992 UNIVERSITE JOSEPH FOURIER Method for determining the position of an organ
5452024, Nov 01 1993 Texas Instruments Incorporated DMD display system
5457493, Sep 15 1993 Texas Instruments Incorporated Digital micro-mirror based image simulation system
5474073, Nov 22 1994 ADVANCED TECHNOLOGIES LABORATORIES, INC Ultrasonic diagnostic scanning for three dimensional display
5476096, Feb 09 1994 Vingmed Sound A/S Analysis and measurement of temporal tissue variations
5483961, Mar 19 1993 COMPASS INTERNATIONAL, INC Magnetic field digitizer for stereotactic surgery
5488431, Nov 04 1993 Texas Instruments Incorporated Video data formatter for a multi-channel digital television system without overlap
5489952, Jul 14 1993 Texas Instruments Incorporated Method and device for multi-format television
5491510, Dec 03 1993 Texas Instruments Incorporated System and method for simultaneously viewing a scene and an obscured object
5494039, Jul 16 1993 ENDOCARE, INC Biopsy needle insertion guide and method of use in prostate cryosurgery
5503152, Sep 28 1994 Hoechst AG Ultrasonic transducer assembly and method for three-dimensional imaging
5505204, May 13 1993 Victoria Hospital Corporation; London Health Association Ultrasonic blood volume flow rate meter
5515856, Jan 27 1996 Vingmed Sound A/S Method for generating anatomical M-mode displays
5517990, Nov 30 1992 CLEVELAND CLINIC FOUNDATION, THE Stereotaxy wand and tool guide
5526051, Oct 27 1993 Texas Instruments Incorporated Digital television system
5526812, Jun 21 1993 General Electric Company Display system for enhancing visualization of body structures during medical procedures
5529070, Nov 22 1990 Advanced Technology Laboratories, Inc. Acquisition and display of ultrasonic images from sequentially oriented image planes
5531227, Jan 28 1994 SCHNEIDER MEDICAL TECHNOLOGIES, INC Imaging device and method
5532997, Jun 06 1990 Texas Instruments Incorporated Optical tracking system
5541723, Jun 21 1993 Minolta Camera Kabushiki Kaisha Distance measuring device
5558091, Oct 06 1993 Biosense, Inc Magnetic determination of position and orientation
5568811, Oct 04 1994 Vingmed Sound A/S Method for motion encoding of tissue structures in ultrasonic imaging
5570135, Jul 14 1993 Texas Instruments Incorporated Method and device for multi-format television
5579026, May 14 1993 Olympus Optical Co., Ltd. Image display apparatus of head mounted type
5581271, Dec 05 1994 Raytheon Company; HE HOLDINGS, INC , A DELAWARE CORP Head mounted visual display
5588948, Feb 17 1993 Olympus Optical Co. Ltd. Stereoscopic endoscope
5608468, Jul 14 1993 Texas Instruments Incorporated Method and device for multi-format television
5608849, Aug 27 1991 Method of visual guidance for positioning images or data in three-dimensional space
5611345, Apr 24 1995 Medical instrument with improved ultrasonic visibility
5611353, Jun 21 1993 HOWMEDICA OSTEONICS CORP Method and apparatus for locating functional structures of the lower leg during knee surgery
5612753, Jan 27 1995 Texas Instruments Incorporated Full-color projection display system using two light modulators
5625408, Jun 24 1993 Canon Kabushiki Kaisha Three-dimensional image recording/reconstructing method and apparatus therefor
5628327, Dec 15 1994 IMARX THERAPEUTICS, INC Apparatus for performing biopsies and the like
5629794, May 31 1995 Texas Instruments Incorporated Spatial light modulator having an analog beam for steering light
5630027, Dec 28 1994 Texas Instruments Incorporated Method and apparatus for compensating horizontal and vertical alignment errors in display systems
5647361, Dec 18 1992 Fonar Corporation Magnetic resonance imaging method and apparatus for guiding invasive therapy
5647373, Nov 07 1993 ULTRA-GUIDE, LTD Articulated needle guide for ultrasound imaging and method of using same
5660185, Apr 13 1995 NeoVision Corporation Image-guided biopsy apparatus with enhanced imaging and methods
5662111, Jan 28 1991 INTEGRA RADIONICS, INC Process of stereotactic optical navigation
5699444, Mar 31 1995 JADE PIXEL, LLC Methods and apparatus for using image data to determine camera location and orientation
5701898, Sep 02 1994 UNITED STATES OF AMERICA, THE, AS REPRESENTED BY THE SECRETARY, DEPT OF HEALTH AND HUMAN SERVICES Method and system for Doppler ultrasound measurement of blood flow
5701900, May 01 1995 Cedars-Sinai Medical Center Ultrasonic transducer orientation sensing and display apparatus and method
5726670, Jul 20 1992 Olympus Optical Co., Ltd. Display apparatus to be mounted on the head or face of an individual
5728044, Mar 10 1995 Sensor device for spatial imaging of endoscopes
5758650, Sep 30 1996 Siemens Medical Solutions USA, Inc Universal needle guide for ultrasonic transducers
5766135, Mar 08 1995 Echogenic needle tip
5784098, Aug 28 1995 Olympus Optical Co., Ltd. Apparatus for measuring three-dimensional configurations
5792147, Mar 17 1994 KING'S COLLEGE LONDON Video-based systems for computer assisted surgery and localisation
5793701, Apr 07 1995 Acuson Corporation Method and apparatus for coherent image formation
5797849, Mar 28 1995 Sonometrics Corporation Method for carrying out a medical procedure using a three-dimensional tracking and imaging system
5807395, Aug 27 1993 Medtronic, Inc. Method and apparatus for RF ablation and hyperthermia
5810008, Dec 03 1996 Brainlab AG Apparatus and method for visualizing ultrasonic images
5817022, Mar 28 1995 Sonometrics Corporation System for displaying a 2-D ultrasound image within a 3-D viewing environment
5820554, May 31 1993 Medtronic, Inc. Ultrasound biopsy needle
5820561, Jul 30 1996 Vingmed Sound A/S Analysis and measurement of temporal tissue velocity information
5829439, Jun 28 1995 Hitachi Medical Corporation Needle-like ultrasonic probe for ultrasonic diagnosis apparatus, method of producing same, and ultrasonic diagnosis apparatus using same
5829444, Sep 15 1994 GE Medical Systems Global Technology Company, LLC Position tracking and imaging system for use in medical applications
5851183, Oct 19 1990 St. Louis University System for indicating the position of a surgical probe within a head on an image of the head
5870136, Dec 05 1997 The University of North Carolina at Chapel Hill Dynamic generation of imperceptible structured light for tracking and acquisition of three dimensional scene geometry and surface characteristics in interactive three dimensional computer graphics applications
5891034, Oct 19 1990 ST LOUIS UNIVERSITY System for indicating the position of a surgical probe within a head on an image of the head
5920395, Apr 22 1993 IMAGE GUIDED TECHNOLOGIES, INC System for locating relative positions of objects in three dimensional space
5961527, Jan 22 1997 CIVCO MEDICAL INSTRUMENTS CO., INC. Omni-directional precision instrument platform
5967980, Sep 15 1994 GE Medical Systems Global Technology Company, LLC Position tracking and imaging system for use in medical applications
5967991, Dec 03 1996 EchoCath, Inc. Drive apparatus for an interventional medical device used in an ultrasonic imaging system
5991085, Apr 21 1995 i-O Display Systems LLC Head-mounted personal visual display apparatus with image generator and holder
6016439, Oct 15 1996 Biosense, Inc Method and apparatus for synthetic viewpoint imaging
6019724, Feb 08 1996 Sonowand AS Method for ultrasound guidance during clinical procedures
6048312, Apr 23 1998 General Electric Company Method and apparatus for three-dimensional ultrasound imaging of biopsy needle
6064749, Aug 02 1996 THE UNIVERSITY OF NORTH CAROLINA AT CHAPEL HILL Hybrid tracking for augmented reality using both camera motion detection and landmark tracking
6091546, Oct 30 1997 GOOGLE LLC Eyeglass interface system
6095982, Mar 14 1995 Board of Regents, The University of Texas System Spectroscopic method and apparatus for optically detecting abnormal mammalian epithelial tissue
6099471, Oct 07 1997 G E VINGMED ULTRASOUND AS Method and apparatus for real-time calculation and display of strain in ultrasound imaging
6108130, Sep 10 1999 U.S. BANK NATIONAL ASSOCIATION, AS COLLATERAL AGENT Stereoscopic image sensor
6122538, Jan 16 1997 Siemens Medical Solutions USA, Inc Motion-monitoring method and system for medical devices
6122541, May 04 1995 INTEGRA BURLINGTON MA, INC Head band for frameless stereotactic registration
6160666, Mar 23 1998 i-O Display Systems LLC Personal visual display system
6167296, Jun 28 1996 CICAS IP LLC Method for volumetric image navigation
6181371, May 30 1995 MAGUIRE, SUSAN C Apparatus for inducing attitudinal head movements for passive virtual reality
6216029, Jul 16 1995 Trig Medical Ltd Free-hand aiming of a needle guide
6241725, Dec 15 1993 Covidien AG; TYCO HEALTHCARE GROUP AG High frequency thermal ablation of cancerous tumors and functional targets with image data assistance
6245017, Oct 30 1998 Toshiba Medical Systems Corporation 3D ultrasonic diagnostic apparatus
6246784, Aug 19 1997 THE GOVERNMENT OF THE UNITED STATES OF AMERICA AS REPRESENTED BY THE SECRETARY OF THE DEPARTMENT OF HEALTH AND HUMAN SERVICES Method for segmenting medical images and detecting surface anomalies in anatomical structures
6246898, Mar 28 1995 Sonometrics Corporation Method for carrying out a medical procedure using a three-dimensional tracking and imaging system
6248101, Jan 22 1997 CIVCO MEDICAL INSTRUMENTS CO., INC. Omni-directional precision instrument platform
6261234, May 07 1998 Diasonics Ultrasound, Inc. Method and apparatus for ultrasound imaging with biplane instrument guidance
6341016, Aug 06 1999 Method and apparatus for measuring three-dimensional shape of object
6348058, Dec 12 1997 SOFAMOR DANEK GROUP, INC Image guided spinal surgery guide, system, and method for use thereof
6350238, Nov 02 1999 GE Medical Systems Global Technology Company, LLC Real-time display of ultrasound in slow motion
6352507, Aug 23 1999 G.E. Vingmed Ultrasound AS Method and apparatus for providing real-time calculation and display of tissue deformation in ultrasound imaging
6379302, Oct 28 1999 Medtronic Navigation, Inc Navigation information overlay onto ultrasound imagery
6385475, Mar 11 1997 Aesculap AG Process and device for the preoperative determination of the positioning data of endoprosthetic parts
6442417, Nov 29 1999 STRYKER EUROPEAN HOLDINGS III, LLC Method and apparatus for transforming view orientations in image-guided surgery
6447450, Nov 02 1999 GE Medical Systems Global Technology Company, LLC ECG gated ultrasonic image compounding
6456868, Mar 30 1999 Olympus Corporation Navigation apparatus and surgical operation image acquisition/display apparatus using the same
6470207, Mar 23 1999 Medtronic Navigation, Inc Navigational guidance via computer-assisted fluoroscopic imaging
6471366, Jul 30 2001 The United States of America as represented by the Secretary of the Navy Depth-compensated underwater light
6477400, Aug 20 1998 SOFAMOR DANEK HOLDINGS, INC Fluoroscopic image guided orthopaedic surgery system with intraoperative registration
6478793, Jun 11 1999 Covidien AG Ablation treatment of bone metastases
6503195, May 24 1999 THE UNIVERSITY OF NORTH CAROLINA Methods and systems for real-time structured light depth extraction and endoscope using real-time structured light depth extraction
6511418, Mar 30 2000 CICAS IP LLC Apparatus and method for calibrating an endoscope
6517485, Aug 23 1999 G.E. Vingmed Ultrasound AS Method and apparatus for providing real-time calculation and display of tissue deformation in ultrasound imaging
6518939, Nov 08 1996 Olympus Optical Co., Ltd. Image observation apparatus
6527443, Apr 20 1999 Brainlab AG Process and apparatus for image guided treatment with an integration of X-ray detection and navigation system
6529758, Jun 28 1996 CICAS IP LLC Method and apparatus for volumetric image navigation
6537217, Aug 24 2001 GE Medical Systems Global Technology Company, LLC Method and apparatus for improved spatial and temporal resolution in ultrasound imaging
6545706, Jul 30 1999 DIGIMEDIA TECH, LLC System, method and article of manufacture for tracking a head of a camera-generated image of a person
6546279, Oct 12 2001 UNIVERSITY OF FLORIDA Computer controlled guidance of a biopsy needle
6551325, Sep 26 2000 Brainlab AG Device, system and method for determining the position of an incision block
6570566, Jun 10 1999 Sony Corporation Image processing apparatus, image processing method, and program providing medium
6575969, May 04 1995 Covidien AG; TYCO HEALTHCARE GROUP AG Cool-tip radiofrequency thermosurgery electrode system for tumor ablation
6579240, Jun 12 2001 GE Medical Systems Global Technology Company, LLC Ultrasound display of selected movement parameter values
6587711, Jul 22 1999 THE RESEARCH FOUNDATION OF CITY COLLEGE OF NEW YORK Spectral polarizing tomographic dermatoscope
6591130, Jun 28 1996 CICAS IP LLC Method of image-enhanced endoscopy at a patient site
6592522, Jun 12 2001 GE Medical Systems Global Technology Company, LLC Ultrasound display of displacement
6594517, May 15 1998 Robin Medical, Inc. Method and apparatus for generating controlled torques on objects particularly objects inside a living body
6597818, May 09 1997 SRI International Method and apparatus for performing geo-spatial registration of imagery
6604404, Dec 31 1997 Trig Medical Ltd Calibration method and apparatus for calibrating position sensors on scanning transducers
6616610, Nov 16 2000 GE Medical Systems Kretztechnik GmbH & Co oHG Method for determination of the direction of introduction and for controlling the introduction path of biopsy needles
6626832, Apr 15 1999 Trig Medical Ltd Apparatus and method for detecting the bending of medical invasive tools in medical interventions
6652462, Jun 12 2001 GE Medical Systems Global Technology Company, LLC. Ultrasound display of movement parameter gradients
6669635, Oct 28 1999 Surgical Navigation Technologies, Inc. Navigation information overlay onto ultrasound imagery
6676599, Aug 23 1999 G.E. Vingmed Ultrasound AS Method and apparatus for providing real-time calculation and display of tissue deformation in ultrasound imaging
6689067, Nov 28 2001 Siemens Corporation Method and apparatus for ultrasound guidance of needle biopsies
6695786, Mar 16 2001 U-Systems, Inc Guide and position monitor for invasive medical instrument
6711429, Sep 24 1998 Covidien LP System and method for determining the location of a catheter during an intra-body medical procedure
6725082, Mar 17 1999 AO Technology AG System and method for ligament graft placement
6733458, Sep 25 2001 Siemens Medical Solutions USA, Inc Diagnostic medical ultrasound systems and methods using image based freehand needle guidance
6764449, Dec 31 2001 Medison Co., Ltd. Method and apparatus for enabling a biopsy needle to be observed
6766184, Mar 28 2000 Board of Regents, The University of Texas System Methods and apparatus for diagnostic multispectral digital imaging
6768496, Mar 30 2000 Siemens Healthcare GmbH System and method for generating an image from an image dataset and a video image
6775404, Mar 18 1999 University of Washington Apparatus and method for interactive 3D registration of ultrasound and magnetic resonance images based on a magnetic position sensor
6782287, Jun 27 2000 STRYKER EUROPEAN HOLDINGS III, LLC Method and apparatus for tracking a medical instrument based on image registration
6783524, Apr 19 2001 KRANOS IP II CORPORATION Robotic surgical tool with ultrasound cauterizing and cutting instrument
6827723, Feb 27 2001 Smith & Nephew, Inc Surgical navigation systems and processes for unicompartmental knee arthroplasty
6863655, Jun 12 2001 GE Medical Systems Global Technology Company, LLC Ultrasound display of tissue, tracking and tagging
6873867, Apr 05 2000 Brainlab AG Referencing or registering a patient or a patient body part in a medical navigation system by means of irradiation of light points
6875179, Jun 17 2002 BOARD OF TRUSTEES OF THE UNIVERSITY OF ARKANSAS Ultrasonic guided catheter deployment system
6881214, Jun 11 1999 Covidien AG; TYCO HEALTHCARE GROUP AG Ablation treatment of bone metastases
6895268, Jun 28 1999 Siemens Aktiengesellschaft Medical workstation, imaging system, and method for mixing two images
6915150, Mar 11 1997 Aesculap AG Process and device for the preoperative determination of the positioning data of endoprosthetic parts
6917827, Nov 17 2000 GE Medical Systems Global Technology Company, LLC Enhanced graphic features for computer assisted surgery system
6923817, Feb 27 2001 Smith & Nephew, Inc Total knee arthroplasty systems and processes
6936048, Jan 16 2003 Gynesonics, Inc Echogenic needle for transvaginal ultrasound directed reduction of uterine fibroids and an associated method
6947783, Sep 26 2000 Smith & Nephew, Inc; SMITH & NEPHEW ORTHOPAEDICS AG; SMITH & NEPHEW PTE LIMITED System for the navigation-assisted positioning of elements
6968224, Oct 28 1999 Surgical Navigation Technologies, Inc. Method of detecting organ matter shift in a patient
6978167, Jul 01 2002 CLARONAV INC Video pose tracking system and method
7008373, Nov 08 2001 The Johns Hopkins University System and method for robot targeting under fluoroscopy based on image servoing
7033360, Mar 11 1997 Aesculap AG Process and device for the preoperative determination of the positioning data of endoprosthetic parts
7072707, Jun 27 2001 Vanderbilt University Method and apparatus for collecting and processing physical space data for use while performing image-guided surgery
7077807, Aug 23 1999 G.E. Vingmed Ultrasound AS Method and apparatus for providing real-time calculation and display of tissue deformation in ultrasound imaging
7093012, Sep 14 2000 R2 SOLUTIONS LLC System and method for enhancing crawling by extracting requests for webpages in an information flow
7110013, Mar 15 2000 Information Decision Technologies, LLC Augmented reality display integrated with self-contained breathing apparatus
7171255, Jul 26 1995 CMSI HOLDINGS CORP.; IMPAC MEDICAL SYSTEMS, INC. Virtual reality 3D visualization for surgical procedures
7209776, Dec 03 2002 Aesculap AG Method of determining the position of the articular point of a joint
7245746, Jun 12 2001 GE Medical Systems Global Technology Company, LLC Ultrasound color characteristic mapping
7248232, Feb 25 1998 SEMICONDUCTOR ENERGY LABORATORY CO., LTD. Information processing device
7261694, Aug 23 1999 G.E. Vingmed Ultrasound AS Method and apparatus for providing real-time calculation and display of tissue deformation in ultrasound imaging
7313430, Aug 28 2003 Medtronic Navigation, Inc. Method and apparatus for performing stereotactic surgery
7331932, Dec 15 2000 Aesculap AG Method and device for determining the mechanical axis of a femur
7351205, Jan 03 2003 CIVCO Medical Instruments Co., Inc. Shallow angle needle guide apparatus and method
7379769, Sep 30 2003 Hologic, Inc.; Biolucent, LLC; Cytyc Corporation; CYTYC SURGICAL PRODUCTS, LIMITED PARTNERSHIP; SUROS SURGICAL SYSTEMS, INC.; Third Wave Technologies, INC.; Gen-Probe Incorporated Hybrid imaging method to monitor medical device delivery and patient support for use in the method
7385708, Jun 07 2002 University of North Carolina at Chapel Hill Methods and systems for laser based real-time structured light depth extraction
7392076, Nov 04 2003 STRYKER EUROPEAN HOLDINGS III, LLC System and method of registering image data to intra-operatively digitized landmarks
7398116, Aug 11 2003 Veran Medical Technologies, Inc. Methods, apparatuses, and systems useful in conducting image guided interventions
7466303, Feb 10 2004 SUNNYBROOK HEALTH SCIENCES CENTRE Device and process for manipulating real and virtual objects in three-dimensional space
7480533, Jun 11 1999 Covidien AG; TYCO HEALTHCARE GROUP AG Ablation treatment of bone metastases
7505809, Jan 13 2003 ST. JUDE MEDICAL INTERNATIONAL HOLDING S.À R.L. Method and system for registering a first image with a second image relative to the body of a patient
7588541, Dec 10 2003 FUJIFILM SONOSITE, INC Method and system for positioning a medical device at one or more angles relative to an imaging probe
7596267, Feb 28 2003 MERATIVE US L.P. Image region segmentation system and method
7652259, Nov 04 2002 Spectrum Dynamics Medical Limited Apparatus and methods for imaging and attenuation correction
7662128, Dec 18 2003 Steerable needle
7678052, Apr 13 2004 General Electric Company Method and apparatus for detecting anatomic structures
7728868, Aug 02 2006 INNEROPTIC TECHNOLOGY, INC System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities
7747305, Jun 11 2003 Osteoplastics, LLC Computer-aided-design of skeletal implants
7797032, Oct 28 1999 SURGICAL NAVIGATION TECHNOLOGIES, INC Method and system for navigating a catheter probe in the presence of field-influencing objects
7798965, Aug 23 1999 G.E. Vingmed Ultrasound AS Method and apparatus for providing real-time calculation and display of tissue deformation in ultrasound imaging
7833168, Aug 13 2003 Best Medical International, Inc Targeted biopsy delivery system
7833221, Oct 22 2004 Ethicon Endo-Surgery, Inc. System and method for treatment of tissue using the tissue as a fiducial
7846103, Sep 17 2004 Medical Equipment Diversified Services, Inc. Probe guide for use with medical imaging systems
7876942, Mar 30 2006 STRYKER EUROPEAN HOLDINGS III, LLC System and method for optical position measurement and guidance of a rigid or semi-flexible tool to a target
7889905, May 23 2005 SRONCUB, INC Fast 3D-2D image registration method with application to continuously guided endoscopy
7912849, May 06 2005 Microsoft Corporation Method for determining contextual summary information across documents
7920909, Sep 13 2005 Veran Medical Technologies, Inc. Apparatus and method for automatic image guided accuracy verification
7962193, Sep 13 2005 Veran Medical Technologies, Inc. Apparatus and method for image guided accuracy verification
7976469, Jun 04 2007 Medtronic, Inc. Percutaneous needle guide
8023712, Dec 27 2007 Olympus Corporation Medical system and method for generating medical guide image
8038631, Jun 01 2005 Laparoscopic HIFU probe
8041413, Oct 02 2006 AURIS HEALTH, INC Systems and methods for three-dimensional ultrasound mapping
8050736, Sep 30 2003 Hologic, Inc.; Biolucent, LLC; Cytyc Corporation; CYTYC SURGICAL PRODUCTS, LIMITED PARTNERSHIP; SUROS SURGICAL SYSTEMS, INC.; Third Wave Technologies, INC.; Gen-Probe Incorporated Hybrid imaging method to monitor medical device delivery and patient support for use in the method
8052636, Jul 01 2005 AURIS HEALTH, INC Robotic catheter system and methods
8066644, May 17 2007 Vanderbilt University System, method and device for positioning a target located within soft tissue in a path of an instrument
8073528, Sep 30 2007 Intuitive Surgical Operations, Inc Tool tracking systems, methods and computer products for image guided surgery
8086298, Sep 29 2008 CIVCO MEDICAL INSTRUMENTS CO., INC. EM tracking systems for use with ultrasound and other imaging modalities
8135669, Oct 13 2005 MICROSOFT INTERNATIONAL HOLDINGS B.V. Information access with usage-driven metadata feedback
8137281, Dec 20 2005 SHENZHEN MINDRAY BIO-MEDICAL ELECTRONICS CO., LTD. Structure for attaching needle guide to ultrasound probe
8147408, Aug 31 2005 FUJIFILM SONOSITE, INC Medical device guide locator
8152724, Nov 11 2003 Soma Research, LLC Ultrasound guided probe device and method of using same
8167805, Oct 20 2005 OTSUKA MEDICAL DEVICES CO., LTD. Systems and methods for ultrasound applicator station keeping
8216149, Aug 11 2005 Toshiba Medical Systems Corporation Puncture adaptor, ultrasonic probe for puncture, ultrasonic diagnostic apparatus for puncture, method for detecting angle of puncture needle
8221322, Jun 07 2002 VERATHON INC Systems and methods to improve clarity in ultrasound images
8228028, Jun 05 2007 Ascension Technology Corporation; ROPER ASCENSION ACQUISITION, INC Systems and methods for compensating for large moving objects in magnetic-tracking environments
8257264, Sep 02 2005 Ultrasound Ventures, LLC Ultrasonic probe with a needle clip and method of using same
8296797, Oct 19 2005 Microsoft Corporation Intelligent video summaries in information access
8340379, Mar 07 2008 INNEROPTIC TECHNOLOGY, INC Systems and methods for displaying guidance data based on updated deformable imaging data
8350902, Aug 02 2006 InnerOptic Technology, Inc. System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities
8482606, Aug 02 2006 InnerOptic Technology, Inc. System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities
8554307, Apr 12 2010 INNEROPTIC TECHNOLOGY, INC Image annotation in image-guided medical procedures
8585598, Feb 17 2009 InnerOptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image guided surgery
8670816, Jan 30 2012 InnerOptic Technology, Inc. Multiple medical device guidance
8690776, Feb 17 2009 INNEROPTIC TECHNOLOGY, INC Systems, methods, apparatuses, and computer-readable media for image guided surgery
8831310, Mar 07 2008 InnerOptic Technology, Inc. Systems and methods for displaying guidance data based on updated deformable imaging data
9107698, Apr 12 2010 InnerOptic Technology, Inc. Image annotation in image-guided medical procedures
9282947, Dec 01 2009 INNEROPTIC TECHNOLOGY, INC Imager focusing based on intraoperative data
9364294, Feb 17 2009 InnerOptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures
9398936, Feb 17 2009 InnerOptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image guided surgery
9659345, Aug 02 2006 InnerOptic Technology, Inc. System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities
9675319, Feb 17 2016 InnerOptic Technology, Inc. Loupe display
9901406, Oct 02 2014 INNEROPTIC TECHNOLOGY, INC Affected region display associated with a medical device
9949700, Jul 22 2015 InnerOptic Technology, Inc. Medical device approaches
20010007919,
20010016804,
20010031920,
20010041838,
20010045979,
20020010384,
20020032772,
20020049375,
20020077540,
20020077543,
20020103431,
20020105484,
20020135673,
20020138008,
20020140814,
20020156375,
20020198451,
20030032878,
20030040743,
20030073901,
20030135119,
20030163142,
20030164172,
20030231789,
20040034313,
20040078036,
20040095507,
20040116810,
20040147920,
20040152970,
20040181144,
20040215071,
20040238732,
20040243146,
20040243148,
20040249281,
20040249282,
20040254454,
20050010098,
20050085717,
20050085718,
20050090742,
20050107679,
20050111733,
20050159641,
20050182316,
20050192564,
20050219552,
20050222574,
20050231532,
20050240094,
20050251148,
20060004275,
20060020204,
20060036162,
20060052792,
20060058609,
20060058610,
20060058674,
20060058675,
20060100505,
20060122495,
20060184040,
20060193504,
20060229594,
20060235290,
20060235538,
20060241450,
20060253030,
20060253032,
20060271056,
20060282023,
20060293643,
20070002582,
20070016035,
20070024617,
20070032906,
20070073155,
20070073455,
20070078346,
20070167699,
20070167701,
20070167705,
20070167771,
20070167801,
20070225553,
20070239281,
20070244488,
20070255136,
20070270718,
20070276234,
20070291000,
20080004481,
20080004516,
20080030578,
20080039723,
20080051910,
20080091106,
20080114235,
20080146939,
20080161824,
20080183080,
20080200794,
20080208031,
20080208081,
20080214932,
20080232679,
20080287794,
20080287805,
20080287837,
20090024030,
20090036902,
20090105597,
20090118613,
20090118724,
20090131783,
20090137907,
20090196480,
20090234369,
20090312629,
20100045783,
20100152570,
20100185087,
20100198045,
20100208963,
20100268072,
20100268085,
20100296718,
20100298705,
20100305448,
20100312121,
20100331252,
20110043612,
20110046483,
20110046486,
20110057930,
20110082351,
20110130641,
20110137156,
20110201915,
20110201976,
20110208055,
20110230351,
20110237947,
20110238043,
20110251483,
20110274324,
20110282188,
20110288412,
20110295108,
20110301451,
20120035473,
20120059260,
20120071759,
20120078094,
20120101370,
20120108955,
20120138658,
20120143029,
20120143055,
20120165679,
20120237105,
20120259210,
20130030286,
20130044930,
20130079770,
20130090646,
20130096497,
20130132374,
20130151533,
20130178745,
20130218024,
20130249787,
20140016848,
20140051987,
20140058387,
20140078138,
20140180074,
20140201669,
20140275760,
20140275810,
20140275997,
20140343404,
20140350390,
20150238259,
20160166334,
20160166336,
20160196694,
20160270862,
20170024903,
20170065352,
20170099479,
20170128139,
20170323424,
20170348067,
20170360395,
20180116731,
20180289344,
20190021681,
20190060001,
20190167354,
20190180411,
20190216547,
20190223958,
20190247130,
20190321107,
20200046315,
AU1719601,
AU2001290363,
AU2003297225,
AU7656896,
AU9036301,
AU9453898,
BR113882,
CA2420382,
DE60126798,
EP427358,
EP1955284,
JP2005058584,
JP2005323669,
JP2009517177,
JP63290550,
JP7116164,
RE30397, Apr 02 1979 Three-dimensional ultrasonic imaging of animal soft tissue
RE37088, Aug 30 1994 Vingmed Sound A/S Method for generating anatomical M-mode displays
WO200317987,
WO1039683,
WO3032837,
WO3034705,
WO3105289,
WO5010711,
WO7019216,
WO7067323,
WO8017051,
WO9063423,
WO9094646,
WO10057315,
WO10096419,
WO11014687,
WO12169990,
WO13116240,
WO96005768,
WO97015249,
WO97017014,
WO97029682,
WO9926534,
Executed on | Assignor | Assignee | Conveyance | Frame/Reel/Doc
Jan 19 2016 | STATE, ANDREI | INNEROPTIC TECHNOLOGY, INC. | ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS) | 044767/0455 pdf
Jan 19 2016 | KOHLI, LUV | INNEROPTIC TECHNOLOGY, INC. | ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS) | 044767/0455 pdf
Jan 19 2016 | RAZZAQUE, SHARIF | INNEROPTIC TECHNOLOGY, INC. | ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS) | 044767/0455 pdf
Jan 19 2016 | HEANEY, BRIAN | INNEROPTIC TECHNOLOGY, INC. | ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS) | 044767/0455 pdf
Jan 29 2018 | InnerOptic Technology, Inc. (assignment on the face of the patent)
Date Maintenance Fee Events
Jan 29 2018 BIG: Entity status set to Undiscounted (note the period is included in the code).
May 01 2024 M1551: Payment of Maintenance Fee, 4th Year, Large Entity.


Date Maintenance Schedule
Nov 03 2023: 4-year fee payment window opens
May 03 2024: 6-month grace period starts (with surcharge)
Nov 03 2024: patent expiry (for year 4)
Nov 03 2026: 2-year window to revive an unintentionally abandoned patent ends (for year 4)
Nov 03 2027: 8-year fee payment window opens
May 03 2028: 6-month grace period starts (with surcharge)
Nov 03 2028: patent expiry (for year 8)
Nov 03 2030: 2-year window to revive an unintentionally abandoned patent ends (for year 8)
Nov 03 2031: 12-year fee payment window opens
May 03 2032: 6-month grace period starts (with surcharge)
Nov 03 2032: patent expiry (for year 12)
Nov 03 2034: 2-year window to revive an unintentionally abandoned patent ends (for year 12)
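The schedule above is plain date arithmetic anchored to the Nov 03 2020 issue date: relative to each expiry date, the fee payment window opens 12 months earlier, the surcharge grace period starts 6 months earlier, and revival of an unintentionally abandoned patent remains possible for 24 months afterward. Below is a minimal Python sketch, not part of the patent record, that reproduces the rows; the add_months helper is a hypothetical utility and the offsets are simply read off the printed dates, not taken from any USPTO source.

```python
# Minimal sketch: regenerate the maintenance schedule above from the issue
# date. Offsets (-12, -6, and +24 months around each expiry) are assumptions
# read off the printed rows; add_months is a hypothetical helper.
from datetime import date

def add_months(d: date, months: int) -> date:
    """Shift a date by whole months, keeping the day of month."""
    total = d.month - 1 + months
    return date(d.year + total // 12, total % 12 + 1, d.day)

issue = date(2020, 11, 3)  # Issued: Nov 03 2020

for year in (4, 8, 12):
    expiry = add_months(issue, 12 * year)  # expiry if the fee goes unpaid
    print(f"{add_months(expiry, -12):%b %d %Y}: {year}-year fee payment window opens")
    print(f"{add_months(expiry, -6):%b %d %Y}: 6-month grace period starts (with surcharge)")
    print(f"{expiry:%b %d %Y}: patent expiry (for year {year})")
    print(f"{add_months(expiry, 24):%b %d %Y}: revival window ends (for year {year})")
```

Running the sketch prints the same four milestones per fee year as the table above, e.g. "Nov 03 2023: 4-year fee payment window opens" through "Nov 03 2034: revival window ends (for year 12)".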