In the field of virtual reality, virtual tool manipulation systems and related methods and software are described in the present disclosure. One implementation of a virtual tool manipulation system, among others, comprises a motion tracking system configured to generate motion information related to the position of a part of a user's body. The virtual tool manipulation system also comprises a haptic feedback system configured to provide a haptic sensation to the user based on the motion information, the position of a virtual tool, and characteristics of the virtual tool.
1. A virtual tool manipulation system comprising:
a motion tracking system, including a first glove having a plurality of sensors and a plurality of actuators, a second glove having a plurality of sensors and a plurality of actuators, and an optical motion capture device, configured to generate motion information related to a user in a three dimensional space by measuring first and second hand position and orientation over time using the plurality of sensors in the first and second gloves, respectively, and head position and orientation over time using the optical motion capture device;
a visual feedback system configured to display a view of a virtual scene to the user based on head position and orientation in the three dimensional space, the virtual scene depicting a first virtual hand, a second virtual hand, a virtual tool, and a virtual subject, the head position and orientation defining an angle and line of sight of the view of the virtual scene, the visual feedback system further configured to change the view of the virtual scene when the optical motion capture device detects a change in the head position or orientation; and
a haptic feedback system configured to provide a haptic sensation to the user, using the plurality of actuators in the first and second gloves, based on first and second hand position and orientation in the three dimensional space, characteristics of the virtual subject including flexibility, strength and an ability to be cut or punctured, and characteristics of the virtual tool including an ability to cut or puncture the virtual subject, the haptic sensation simulating manipulating the virtual tool using the first hand of the user, interactions between the virtual tool and the first hand of the user, and interactions between the virtual subject and the second hand of the user.
7. A processing device comprising:
a memory device adapted to store a virtual tool manipulation program;
a microprocessor adapted to execute the virtual tool manipulation program;
a motion tracker interface adapted to receive motion information, related to a user in a three dimensional space, from a motion capture system including a first glove having a plurality of sensors and a plurality of actuators, a second glove having a plurality of sensors and a plurality of actuators, and an optical motion capture device, the motion information being generated by measuring first and second hand position and orientation over time using the plurality of sensors in the first and second gloves, respectively, and head position and orientation over time using the optical motion capture device;
a virtual display interface adapted to generate video signals for a visual feedback system, the video signals depicting a view of a virtual scene based on head position and orientation in the three dimensional space, the virtual scene depicting a first virtual hand, a second virtual hand, a virtual tool, and a virtual subject, the head position and orientation defining an angle and line of sight of the view of the virtual scene, the visual feedback system configured to change the view of the virtual scene when the optical motion capture device detects a change in the head position or orientation; and
a haptic device interface adapted to generate haptic feedback signals for controlling a haptic feedback device;
wherein the virtual tool manipulation program causes the microprocessor to simulate a surgical procedure and provide a haptic sensation to the user, using the plurality of actuators in the first and second gloves, based on first and second hand position and orientation in the three dimensional space, characteristics of the virtual subject including flexibility, strength and an ability to be cut or punctured, and characteristics of the virtual tool including an ability to cut or puncture the virtual subject, the haptic sensation simulating manipulating the virtual tool using the first hand of the user, interactions between the virtual tool and the first hand of the user, and interactions between the virtual subject and the second hand of the user.
12. A virtual tool manipulation program stored on a tangible computer-readable medium, the virtual tool manipulation program comprising:
simulation data configured to define characteristics of a virtual tool, a first virtual hand, a second virtual hand, and a virtual subject;
logic configured to define one or more interactions between the virtual tool, the first virtual hand, the second virtual hand, and the virtual subject;
logic configured to receive motion information, related to a user in a three dimensional space, from a motion capture system including a first glove having a plurality of sensors and a plurality of actuators, a second glove having a plurality of sensors and a plurality of actuators, and an optical motion capture device, the motion information being generated by measuring first and second hand position and orientation over time using the plurality of sensors in the first and second gloves, respectively, and head position and orientation over time using the optical motion capture device;
logic configured to generate video signals for a visual feedback system, the video signals depicting a view of a virtual scene based on head position and orientation in the three dimensional space, the virtual scene depicting the first virtual hand, the second virtual hand, the virtual tool, and the virtual subject, the head position and orientation defining an angle and line of sight of the view of the virtual scene, the visual feedback system configured to change the view of the virtual scene when the optical motion capture device detects a change in the head position or orientation; and
logic configured to generate output information for controlling a haptic feedback system to provide a haptic sensation to the user based on first and second hand position and orientation in the three dimensional space, characteristics of the virtual subject including flexibility, strength and an ability to be cut or punctured, and characteristics of the virtual tool including an ability to cut or puncture the virtual subject;
wherein the haptic sensation simulates manipulating the virtual tool using the first hand of the user, interactions between the virtual tool and the first hand of the user, and interactions between the virtual subject and the second hand of the user.
2. The virtual tool manipulation system of
3. The virtual tool manipulation system of
4. The virtual tool manipulation system of
5. The virtual tool manipulation system of
6. The virtual tool manipulation system of
8. The processing device of
9. The processing device of
10. The processing device of
the virtual tool manipulation program comprises simulation files, interaction modules, and output processing modules;
the simulation files simulate the first virtual hand, the virtual tool, and the virtual subject;
the interaction modules simulate a first interaction between the first virtual hand and the virtual tool and a second interaction between the virtual tool and the virtual subject; and
the output processing modules generate signals to stimulate at least one sense of the user; and
wherein the at least one sense corresponds to the first interaction and the second interaction.
11. The processing device of
13. The virtual tool manipulation program of
the simulation data defining the virtual tool is configured to define a virtual surgical tool;
the simulation data defining the first virtual hand is configured to define a virtual surgeon; and
the simulation data defining the virtual subject is configured to define a virtual patient.
14. The virtual tool manipulation program of
15. The virtual tool manipulation program of
16. The virtual tool manipulation program of
17. The virtual tool manipulation program of
The present disclosure generally relates to virtual reality and more particularly relates to simulating tools that can be manipulated in a virtual environment.
During training, an apprentice learns how to use the tools associated with a particular trade. Usually, the training process involves the apprentice, under supervision, practicing with the actual tools to perform a specific action upon a subject. The subject in this respect can be a living organism or an inanimate object, depending on the type of tool being used. Tool manipulation skills are developed until the apprentice becomes adequately proficient at the trade. In the medical field, for example, a medical student or surgeon-in-training learns the skills of handling specialized tools for performing different types of surgical or interventional procedures.
In medical training, the surgeon-in-training typically learns the art of tool handling on a cadaver, animal, or box-type trainer. In recent years, however, systems have been developed that allow a trainee to practice surgical procedures in a virtual environment where no real bodies are needed. In some virtual reality systems, the actual handle of a surgical tool is removed from the rest of the tool. Sensors are then attached to the surface of the handle to detect the position and orientation of the tool in a three-dimensional space. An interface allows the sensors to communicate with a computer, and information related to how the handle is manipulated is transferred to the computer for further processing. Images of the tool in a virtual realm are displayed on a visual display device to simulate, in a visual sense, how an actual tool might affect the subject in reality.
One disadvantage of the conventional virtual reality system, in which sensors are attached to the surface of a handle of the tool, is that specific simulation hardware is required for each tool. Thus, it can be very expensive to configure the appropriate sensing hardware and software for a large number of different tools. To overcome these and other deficiencies of conventional systems, and to more realistically simulate tool-handling procedures, further improvements can still be made in the field of virtual reality involving the manipulation of tools. Improved systems can not only provide a trainee with more realistic training but can also offer additional benefits, such as improved tool design methodologies and rapid prototyping of new instruments.
The present disclosure describes systems, methods, and associated software applications for simulating virtual tools and for virtually handling the virtual tools. According to one embodiment among many described herein, a virtual tool manipulation system comprises a motion tracking system configured to generate motion information related to the position of a part of a user's body. The virtual tool manipulation system further comprises a haptic feedback system configured to provide a haptic sensation to the user based on the motion information, the position of a virtual tool, and characteristics of the virtual tool.
Other features, advantages, and implementations of the present disclosure, not expressly disclosed herein, will be apparent to one of ordinary skill in the art upon examination of the following detailed description and accompanying drawings. It is intended that such implied implementations of the present disclosure be included herein.
The components in the following figures are not necessarily drawn to scale. Instead, emphasis is placed upon clearly illustrating the general principles of the present disclosure. Reference characters designating corresponding components are repeated as necessary throughout the figures for the sake of consistency and clarity.
Virtual training for surgical procedures has provided a means for visually displaying images of what a surgery might actually look like in reality. In conventional systems, the handle of an actual surgical tool is removed from the tool and a number of sensors are placed on the handle to detect the position and orientation of the surgical tool in a virtual realm. However, creating a handle sensor device and associated software for interpreting movement of the device can be very expensive, especially when sensor devices and associated hardware and software are needed for each tool.
The present disclosure provides a lower cost alternative to the conventional systems by allowing any tool design, even a new tool design, to be simulated. More particularly, the design of a tool, whether an existing tool or a new design, can be entered into memory using a computer-aided design (CAD) program. Using the design of a tool in addition to other relevant characteristics of the tool, e.g. weight, weight distribution, center of gravity, degrees of freedom, sharpness, resiliency, etc., the present system can provide haptic or kinesthetic feedback to the user to give the user the sensation of the tool's characteristics, as if the user were actually holding a real tool. The haptic feedback gives the user the sensation of position, movement, tension, etc. of the virtual tool.
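By way of illustration only, the following sketch shows how tool characteristics such as weight and center of gravity might drive a simple weight-and-balance haptic cue. The class, the function, and the physics model are assumptions made for this sketch, not the disclosed implementation.

```python
from dataclasses import dataclass

# Hypothetical sketch only: names and physics are assumptions, not the
# disclosed implementation.

@dataclass
class ToolCharacteristics:
    mass_kg: float                  # overall weight of the tool
    center_of_gravity: tuple        # (x, y, z) offset from the grip, meters
    sharpness: float                # 0.0 (blunt) .. 1.0 (scalpel-sharp)

def weight_feedback(tool: ToolCharacteristics, g: float = 9.81) -> dict:
    """Approximate the static sensation of holding the tool: a downward
    force plus the torque it produces about the grip point."""
    force = (0.0, -tool.mass_kg * g, 0.0)
    cx, cy, cz = tool.center_of_gravity
    fy = force[1]
    torque = (-cz * fy, 0.0, cx * fy)   # r x F with F = (0, fy, 0)
    return {"force": force, "torque": torque}

scalpel = ToolCharacteristics(0.025, (0.06, 0.0, 0.0), 0.95)
print(weight_feedback(scalpel))
```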
By simulating the tool in the virtual world, there is no need to actually manufacture a prototype of the tool during design development. According to the teachings of the present disclosure, the user can experience in the virtual world how a tool functions and how it can be handled, while receiving feedback confirming the user's control of the virtual tool. With the ability to simulate tools in this way, a plurality of different types and designs of tools can be simulated and stored. The simulated virtual tools can have different handles, different sizes, shapes, forms, textures, etc. Also, depending on the type of tool being simulated, the tool can be virtually “gripped” in any suitable manner according to the particular design. As a result of either eliminating the need to manufacture a prototype or dramatically reducing the number of prototypes during the R&D phase, and eliminating the need to create associated sensing devices and software for each individual tool, the cost of including numerous tools in a training environment can be greatly reduced compared to systems in which sensing structure and related software for each physical tool must be configured.
Regarding new tool design according to the present disclosure, designs of tools can be entered into a computer and virtually manipulated. With haptic feedback provided to the user, the user is able to experience the sensation of actually holding the tool. Based on the ease of use and the feel of the tool for accomplishing its intended purpose, the design of the tool can then be altered as necessary without the need to manufacture the tool from inception. In this sense, since no physical material is needed, example embodiments described herein can avoid wasting manufacturing and material costs. The iterative process of designing a tool, utilizing the tool in the virtual realm, and tweaking the design of the tool, can result in a better tool design and can be accomplished in a shorter amount of time.
Virtual tool manipulation system 10, according to the teachings herein, can be configured to simulate any type of operation involving human movement, e.g. computer gaming or surgery, and any occupation, e.g. surgeon, aircraft pilot, astronaut, scientist, construction worker, factory worker, etc. Virtual tool manipulation system 10 can be used for training purposes to allow a specialist to practice a procedure in a safe training setting.
Motion tracking system 12 may include sensing devices for tracking the kinematics or position of certain points in three-dimensional space over time. The sensing devices can also track the position or angle of these points with respect to each other, or can rely on other motion tracking techniques. In particular, motion tracking system 12 is capable of making several measurements of position every second to simulate continual movement. In some embodiments, motion tracking system 12 does not include a corresponding physical element that the user can hold or touch in reality. Instead, in this case, the virtual tool exists only in the virtual realm and is completely virtual. In other embodiments, motion tracking system 12 includes a physical prop that the user can touch. The physical prop in this case can be, for example, a handle having approximately the same size and shape as the virtual tool.
Motion tracking system 12, for example, may include a data glove, such as a CyberGlove™, which can be worn on the hand of a user. The data glove can include sensors for measuring the bending angles of several joints of the user's hand, wrist, and fingers. In some implementations, motion tracking system 12 may be capable of measuring x, y, and z positions in three-dimensional space with respect to a reference point and may further be capable of measuring orientation in space relative to a fixed reference frame. In other embodiments, motion tracking system 12 may include an optical motion capture device for tracking images of the user and calculating the positions and orientations of various user body parts, such as the hand or hands of a user, wrists, arms, shoulders, head, legs, feet, etc., if necessary.
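As a brief illustration, one motion sample from such a glove and tracker might be packaged as sketched below. The field names and units are assumptions made for the sketch and do not reflect any particular device's interface.

```python
import math

# Illustrative packaging of one motion sample; field names and units are
# assumptions and do not reflect the CyberGlove(TM) API.

def hand_pose(bend_angles_deg, position_m, orientation_quat):
    """Bundle joint bend angles with hand position and orientation."""
    return {
        "joints_rad": [math.radians(a) for a in bend_angles_deg],
        "position": position_m,           # (x, y, z) w.r.t. a reference point
        "orientation": orientation_quat,  # (w, x, y, z) w.r.t. a fixed frame
    }

sample = hand_pose([10.0, 45.0, 30.0], (0.1, 1.2, 0.4), (1.0, 0.0, 0.0, 0.0))
print(sample["joints_rad"])
```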
Processing system 14 may include any suitable processing and storage components for managing the motion information measured by motion tracking system 12.
Haptic feedback system 16 may include any suitable device that provides any type of force feedback, vibrotactile feedback, and/or tactile feedback to the user. This feedback is able to provide the user with the sensation of actually holding the simulated tool. Haptic feedback system 16 simulates the physical texture, pressures, forces, resistance, vibration, etc. of the tool, which can relate to the tool's movement in space and to the interaction of the tool with the subject. Haptic feedback system 16 may include mechanisms such as the CyberGrasp™, CyberForce™, CyberTouch™, etc., for example, which can apply at least the sensations of force, weight, resistance, and vibration to the user.
Visual feedback system 18 may include any suitable virtual reality display device, such as virtual goggles, display screens, etc. Visual feedback system 18 can show the appearance of the tool and how the tool performs when handled by the virtual manipulator. Visual feedback system 18 may also show how the tool reacts to various forces or actions applied to the tool, and how the subject is affected by the virtual tool's movements or operations.
Generally, virtual tool manipulation system 10 operates as follows. Motion tracking system 12 tracks the motion of one or more parts, e.g. hands, of a user's body. The motion information is sent to processing system 14, which processes the motion information and determines how the motion affects a virtual tool. Also, processing system 14 determines how the virtual tool interacts with a particular subject. In response to these processing procedures, processing system 14 provides haptic feedback signals to haptic feedback system 16 based on interactions between the user and the virtual tool, depending on the particular motion of the user and the characteristics of the virtual tool. Also, processing system 14 provides virtual reality information to visual feedback system 18 to provide a real-time visual simulation of the user's hand, the tool being manipulated, and the effect of the tool's movement on the subject. In addition, processing system 14 may also create audio signals related to the interactions among the virtual manipulator, virtual tool, and virtual subject.
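That data flow can be summarized in a skeletal update loop, sketched below. Every object and method in the sketch is a hypothetical stand-in for the components just described, not an actual interface of the system.

```python
import time

# Skeletal update loop matching the data flow described above; every
# object and method here is a hypothetical stand-in, not an actual API.

def simulation_loop(tracker, processor, haptics, display, hz=60):
    period = 1.0 / hz
    while True:
        motion = tracker.read()                          # hand/body samples
        tool_state = processor.update_tool(motion)       # motion -> tool
        reaction = processor.update_subject(tool_state)  # tool -> subject
        haptics.apply(processor.haptic_signals(motion, tool_state, reaction))
        display.render(processor.scene(tool_state, reaction))
        time.sleep(period)  # crude pacing; a real system would synchronize
```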
In some embodiments, user 20 does not actually grasp an actual tool or even a handle of an actual tool, unlike conventional systems, which normally require the use of a tool handle altered to include sensors thereon. In other embodiments, user 20 may grasp a physical prop that resembles the size and/or shape of an actual tool handle. Although not specifically shown in this figure, motion tracking system 12 tracks the motion of the hands of user 20 in either case.
Also, although not specifically shown, user 20 and/or a data glove worn on the hand of user 20 can receive haptic feedback from haptic feedback system 16.
Microprocessor 22 may be a general-purpose or specific-purpose processor or microcontroller. Memory 24 may include internally fixed storage and/or removable storage media for storing information, data, and/or instructions. The storage within the memory components may include any combination of volatile memory, such as random access memory (RAM), and/or non-volatile memory, such as read only memory (ROM). Memory 24 can also store a software program enabling microprocessor 22 to execute a virtual tool manipulation program or procedure. Various logical instructions or commands may be included in the software program for analyzing the user's movements and regulating feedback to the user based on the virtual interactions among the virtual human manipulator, the virtual tool, and the virtual subject. The virtual tool manipulation program of the present disclosure can be implemented in hardware, software, firmware, or a combination thereof. When implemented in software or firmware, the virtual tool manipulation program can be stored in memory 24 and executed by microprocessor 22. When implemented in hardware, the virtual tool manipulation program can be implemented, for example, using discrete logic circuitry, an application specific integrated circuit (ASIC), a programmable gate array (PGA), a field programmable gate array (FPGA), etc., or any combination thereof.
Memory 24 includes files containing information for simulating various portions of the virtual tool environment. For example, the simulation files may include a simulation of the tool itself, a simulation of the manipulator of the virtual tool, and a simulation of the subject that is impacted by the tool's operation. Memory 24 can also include software programs or code defining how the virtual tool interacts with other things. For example, the interaction between the manipulator and tool, as well as the interaction between the tool and the subject, can be defined.
Input/output (I/O) devices 26 include input mechanisms such as keyboards, keypads, cursor control devices, e.g. computer mice, or other data entry devices. Output devices may include a computer monitor, display device, printer, or other peripheral devices. The I/O devices 26 may also include a device, such as a modem, for communicating with a network and allowing access to a network such as the Internet. The I/O devices 26 can communicate with internal bus 34 via wired or wireless transmission.
Motion tracker interface 28 receives the motion information measured by motion tracking system 12.
The virtual tool manipulation program 38 manages the simulation of the virtual tool, the simulation of the tool's manipulator, and the subject of the tool's operation, each stored as simulation files 44. The virtual tool manipulation program 38 also manages how the tool interacts with the manipulator and subject, depending on characteristics and properties of the manipulator, tool, and subject defined in simulation files 44. The manner in which they interact can be modeled and defined using software code and stored as interaction modules 46. Output processing modules 48 of virtual tool manipulation program 38 include software code relating to how information is fed back to user 20. Based on the virtual interactions of the manipulator, tool, and subject from interaction modules 46, output processing modules 48 provide at least haptic and visual feedback for simulating the virtual tool manipulation experience. Virtual tool manipulation program 38 and any other programs or software code including executable logical instructions as described herein can be embodied in any suitable computer-readable medium for execution by any suitable processing device. The computer-readable medium can include any physical medium that can store the programs or software code for a measurable length of time.
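One possible arrangement of these three module groups is sketched below. The class names are invented for illustration, and the bodies are stubs; this is a sketch of the described structure, not the patented design.

```python
# Invented class names illustrating one possible arrangement of the three
# module groups (simulation files 44, interaction modules 46, output
# processing modules 48); this is a sketch, not the patented design.

class SimulationFiles:
    def __init__(self, manipulator, tool, subject):
        self.manipulator = manipulator   # e.g. hand model
        self.tool = tool                 # e.g. scalpel characteristics
        self.subject = subject           # e.g. patient model

class InteractionModules:
    def step(self, files, motion):
        grip = {"force": 1.0}            # manipulator/tool interaction (stub)
        reaction = {"resistance": 0.2}   # tool/subject interaction (stub)
        return grip, reaction

class OutputProcessingModules:
    def signals(self, grip, reaction):
        # Haptic and video outputs derived from both interactions.
        return {"haptic": grip["force"] + reaction["resistance"],
                "video": "render-command-placeholder"}
```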
In other embodiments, manipulator simulation file 50 may include other information defining other parts of the human body. For example, if a tool to be simulated involves the manipulator pressing a pedal with his or her foot, then manipulator simulation file 50 includes information regarding various foot positions and orientations. In this respect, manipulator simulation file 50 may include data defining two or more parts of the human body. For some tools, two hands may be needed to utilize the tool properly. In this case, two files, one for each hand, can be modeled. Other implementations of manipulator simulation file 50 can account for variations in the size of the user. For example, manipulator simulation file 50 may allow the simulation information to be modified for a manipulator with large hands.
Tool simulation file 52 stores information or data regarding the parameters, characteristics, and functions of the particular tool being simulated. The information in tool simulation file 52 can be downloaded from an external source or can be produced locally using the CAD program 40. The information includes simulation data of all parts of the virtual tool, including the manipulated parts, e.g. handles, buttons, switches, pulls, etc., and other parts of the structure of the tool, e.g. blades, clamping jaws, tool head, etc. Tool simulation file 52 also includes information regarding the characteristics of the tool, such as size, shape, form, sharpness, texture, force impact, vibration, inertia, center of gravity, weight, weight distribution, degrees of freedom, angles of motion, etc. Other characteristics, such as resistance to movement and flexibility, can also be used to define the tools.
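A hypothetical record mirroring the characteristics listed above might look like the following sketch; the field names, types, and units are assumptions made for illustration rather than the file format of tool simulation file 52.

```python
from dataclasses import dataclass

# Hypothetical record mirroring the characteristics listed above; field
# names, types, and units are assumptions made for illustration.

@dataclass
class ToolSimulationFile:
    name: str
    geometry: str                # e.g. path to CAD-exported mesh
    weight_kg: float
    center_of_gravity: tuple     # (x, y, z) in tool coordinates
    sharpness: float             # ability to cut or puncture
    degrees_of_freedom: int
    texture: str = "smooth"
    flexibility: float = 0.0     # resistance to bending, normalized

scalpel = ToolSimulationFile("scalpel", "scalpel.stl", 0.025,
                             (0.06, 0.0, 0.0), 0.95, 6)
```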
Depending on the type of tool being simulated, tool simulation file 52 may differ significantly from one type to another. For example, virtual surgical tools could include scalpels, lancets, clamps, forceps, retractors, hemostats, dermatomes, endoscopes, laparoscopy tools, etc. Carpentry and construction tools simulated in the tool simulation file may include, for example, hammers, saws, drills, picks, crowbars, etc. Controls for operating machines, such as construction equipment, vehicles, factory equipment, etc., can also be simulated and stored as a tool simulation file 52. It should be understood that tool simulation file 52 can include any amount of information for a number of simulated tools. For instance, for surgical training, a surgeon may need certain types of tools for certain procedures. These tools can be stored in related tool simulation files for access as needed. The characteristics of existing tools can be loaded in tool simulation file 52, or, as suggested above, new tools can be designed using appropriate software, such as the CAD program 40.
Subject simulation file 54 includes information or data defining the virtual subject that receives the actions of the virtual tool. The subject can be animate or inanimate, depending on the types of tools being simulated. For surgical tools, the subject would normally be animate, such as a human or, in the case of veterinary medicine, an animal. The anatomy and physiology of the human or animal can be modeled and stored in subject simulation file 54. The anatomy may include the body, skin, tissues, organs, bones, muscles, ligaments, tendons, etc. Other parts of the anatomy may also be incorporated in subject simulation file 54 as needed, depending on the type of virtual procedure the user is performing. Subject simulation file 54 may include characteristics of the subject, such as flexibility, strength, resistance to movement, inertia, resilience, ability to be cut or punctured, etc. Also, subject simulation file 54 can be implemented having a range of characteristics as needed. For example, differences in size, shape, age, sex, etc. of a patient can be modeled to better match the type of surgical procedure with an appropriate body type.
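As a toy example of how such subject characteristics could enter an interaction computation, a cut-or-puncture test might compare effective pressure against tissue strength. The threshold model below is invented for illustration and is not the disclosed method.

```python
# Toy threshold model, invented for illustration: a cut or puncture
# occurs when effective pressure exceeds the tissue's strength.

def punctures(tool_sharpness: float, applied_force_n: float,
              tissue_strength: float) -> bool:
    return tool_sharpness * applied_force_n > tissue_strength

print(punctures(0.95, 2.0, 1.5))   # sharp scalpel, modest force -> True
print(punctures(0.10, 2.0, 1.5))   # blunt probe, same force -> False
```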
In some embodiments, a model can be adapted or derived from computed tomography (CT) scans, x-rays, etc. of a specific patient to simulate a virtual representation of the patient to be operated on. In this sense, a surgeon could practice a surgical procedure on the virtual patient to better understand the real conditions of the patient and any risks or complications that may be involved. Also, subject simulation file 54 can include the differences in anatomy between a male patient and a female patient, if applicable to the particular surgical procedure.
In the fields of building construction or machine operation, the tools would normally be used on inanimate objects, such as lumber, drywall, beams, girders, nails, screws, rivets, or other building materials. Other subjects may include earth, dirt, rocks, concrete, etc. The characteristics of the subjects can be modeled to define the subject's size, shape, weight, flexibility, etc. In other fields in which virtual tools can be simulated and manipulated in a virtual world, other specific subjects receiving the action of the tools can be simulated to reflect a similar subject in reality.
Manipulator/tool interaction module 60 includes software programs or code for defining how the manipulator and tool interact with each other. Manipulator/tool interaction module 60 retrieves information from manipulator simulation file 50 and tool simulation file 52 to determine the characteristics of each. Based on the movements of the manipulator and the characteristics of the tool, manipulator/tool interaction module 60 determines how the tool will move in the virtual realm and what types of haptic feedback the tool might provide to the manipulator. If a tool is grasped in an awkward manner, the haptic feedback can be helpful in that the user can sense whether or not the grip needs to be corrected. Also, the user might be able to better understand the feel of a tool from the haptic feedback. Also, depending on the flexibility of the manipulator with respect to the tool, certain forces or pressures may be determined to simulate the actual sensation of handling the tool. The movement and orientation of the tool depend on the forces applied by the manipulator and may also react to other forces such as gravity and inertia.
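A minimal sketch of this interaction step, assuming a simple point-mass model, might integrate the tool's motion from the applied grip force and gravity while returning an equal-and-opposite reaction cue to the hand. All names and the model itself are assumptions for illustration.

```python
# Minimal point-mass sketch of the manipulator/tool step: integrate the
# tool's velocity from grip force plus gravity and return an
# equal-and-opposite reaction cue. Names and model are assumptions.

def step_tool(velocity, grip_force_n, mass_kg, dt=1.0 / 60.0, g=-9.81):
    ax = grip_force_n[0] / mass_kg
    ay = grip_force_n[1] / mass_kg + g   # gravity acts on the tool too
    az = grip_force_n[2] / mass_kg
    new_velocity = (velocity[0] + ax * dt,
                    velocity[1] + ay * dt,
                    velocity[2] + az * dt)
    reaction = tuple(-f for f in grip_force_n)   # haptic cue to the hand
    return new_velocity, reaction

v, cue = step_tool((0.0, 0.0, 0.0), (0.0, 0.3, 0.0), mass_kg=0.025)
```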
Tool/subject interaction module 62 includes software programs and/or code for defining how the tool and the subject interact with each other. Tool/subject interaction module 62 retrieves information from tool simulation file 52 and subject simulation file 54 to determine the characteristics of each. The interaction of the tool and subject may be based, for example, on the position, orientation, and movement of the tool. Also, based on the strength and other characteristics of the subject, the tool may experience some resistance in its movement. In some situations, this resistance can be applied to manipulator/tool interaction module 60, which translates it to the manipulator. If this is the case, the manipulator may be required to apply additional force or change his or her movement as needed. The reaction of the subject to certain tool functions is also determined by tool/subject interaction module 62. For instance, when the subject is a virtual surgery patient, the use of a tool may result in bleeding or other reactions, dependent upon the type of procedure performed on the subject and the portion of the subject encountering the effects of the tool. In a virtual construction setting, the subject may be virtual wood, for example, and the virtual tool may create virtual sawdust as a result of the tool's interaction with the wood. These and other interactions between the tool and the subject can be programmed into tool/subject interaction module 62 to simulate a realistic reaction of the subject to the tool and any feedback that the subject may impose on the tool.
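Illustratively, and with an invented resistance model, the tool/subject step might compute a resistance term that is handed back for relay to the manipulator, as in the following sketch.

```python
# Invented resistance model for the tool/subject step; the returned
# resistance would be handed back to the manipulator/tool module.

def subject_reaction(tool_speed: float, subject_strength: float,
                     subject_flexibility: float) -> dict:
    resistance = subject_strength * tool_speed * (1.0 - subject_flexibility)
    return {"resistance_n": resistance,
            "deforms": subject_flexibility > 0.5}

print(subject_reaction(tool_speed=0.2, subject_strength=3.0,
                       subject_flexibility=0.7))
```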
As a result of determining the interactions among the manipulator, tool, and subject, haptic feedback processing module 70 can provide feedback of certain forces, pressures, vibrations, etc. to the manipulator. Not only does haptic feedback processing module 70 derive haptic signals from manipulator/tool interaction module 60 and manipulator/subject interaction module 64, but haptic feedback processing module 70 can also derive indirect haptic signals from tool/subject interaction module 62 that may result from the interaction between the tool and the subject. Haptic signals processed in haptic feedback processing module 70 can be transmitted to haptic feedback system 16.
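A sketch of that combination step, assuming per-actuator force channels and an arbitrary safety clamp, could look like this; the channel layout and limit are assumptions.

```python
# Illustrative merge of direct (grip) and indirect (tool-on-subject)
# haptic contributions into one clamped command per actuator; the
# channel layout and safety limit are assumptions.

def combine_haptics(direct_forces, indirect_forces, limit_n=5.0):
    return [max(-limit_n, min(limit_n, d + i))
            for d, i in zip(direct_forces, indirect_forces)]

print(combine_haptics([1.0, 0.5, 0.2], [0.3, 0.0, 4.9]))  # [1.3, 0.5, 5.0]
```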
Output processing modules 48 also include video processing module 72, which generates the virtual reality video image signals. Video processing module 72 may include a graphics processing device for rendering the objects in the three-dimensional virtual world to a two-dimensional display screen. The video image signals generated by video processing module 72 are transmitted to visual feedback system 18.
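At its simplest, that 3-D-to-2-D step reduces to a perspective projection, as in this bare-bones pinhole sketch; a real renderer would add full camera, lighting, and rasterization stages.

```python
# Bare-bones pinhole projection of the kind a renderer performs when
# mapping a 3-D camera-space point to a 2-D screen; illustrative only.

def project(point, focal_length=1.0):
    x, y, z = point
    if z <= 0.0:
        raise ValueError("point must be in front of the camera")
    return (focal_length * x / z, focal_length * y / z)

print(project((0.5, 0.25, 2.0)))   # -> (0.25, 0.125)
```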
It should be understood that the steps, processes, or operations described herein may represent any module or code sequence that can be implemented in software or firmware. In this regard, these modules and code sequences can include commands or instructions for executing specific logical steps, processes, or operations within physical components. It should further be understood that one or more of the steps, processes, and/or operations described herein may be executed substantially simultaneously or in a different order than explicitly described, as would be understood by one of ordinary skill in the art.
The embodiments described herein merely represent examples of implementations and are not intended to necessarily limit the present disclosure to any specific embodiments. Instead, various modifications can be made to these embodiments as would be understood by one of ordinary skill in the art. Any such modifications are intended to be included within the spirit and scope of the present disclosure and protected by the following claims.
Inventors: Ullrich, Christopher J.; Kunkler, Kevin J.