Systems and methods for providing tactile feedback. A method for providing tactile feedback, comprises extracting a plurality of features from a visual representation of a physical object, wherein the extracting is performed using at least one image processing technique, generating a variable frequency pattern corresponding to the extracted features, sending the variable frequency pattern to a computing device, and generating the tactile feedback via the computing device in response to a stimulation applied by a user, wherein the tactile feedback is based on the variable frequency pattern.

Patent: 9046926
Priority: Dec 17 2012
Filed: Apr 01 2013
Issued: Jun 02 2015
Expiry: May 27 2033
Extension: 56 days
Entity: Large
Status: EXPIRED
1. A method for providing tactile feedback, comprising:
extracting a plurality of features from a visual representation of a physical object, wherein the extracting is performed using at least one image processing technique;
generating a variable frequency pattern corresponding to the extracted features;
sending the variable frequency pattern to a computing device;
assigning values within a range for each of the plurality of features, wherein the values correspond to actual characteristics of the physical object, wherein an actual characteristic includes at least texture of the physical object, and a value within the range corresponding to the texture is based on a degree of the texture;
weighting the values, wherein weights applied to the values are based on an importance of each value in connection with accurate identification of the extracted features of the physical object via the tactile feedback; and
generating the tactile feedback via the computing device in response to a stimulation applied by a user, wherein the tactile feedback is based on the variable frequency pattern.
2. The method of claim 1, further comprising:
segmenting the visual representation into a plurality of portions; and
calculating a frequency value for each of the plurality of portions based on the weighted values, wherein a weighted value for an actual characteristic of the physical object in a first portion of the plurality of portions differs from a weighted value for the actual characteristic in a second portion of the plurality of portions.
3. The method of claim 2, further comprising:
mapping the variable frequency pattern for each frequency value of the plurality of portions.
4. The method of claim 1, wherein the variable frequency pattern is sent to the computing device as metadata.
5. The method of claim 1, wherein the tactile feedback comprises a vibration corresponding to the variable frequency pattern.
6. The method of claim 1, wherein the extracted features comprise the texture and at least one of temperature, softness/hardness, moisture content, color and size.
7. The method of claim 1, wherein the computing device is a mobile device.
8. The method of claim 1, wherein the tactile feedback includes a multi-dimensional time-varying pattern of vibration.
9. The method of claim 1, further comprising:
segmenting the visual representation into a plurality of portions;
calculating a frequency value for each of the plurality of portions corresponding to the extracted features; and
mapping the variable frequency pattern for each frequency value of the plurality of portions.
10. The method of claim 1, further comprising:
segmenting the visual representation into a plurality of portions;
wherein a weighted value for an actual characteristic of the physical object in a first portion of the plurality of portions differs from a weighted value for the actual characteristic in a second portion of the plurality of portions.
11. The method of claim 1, wherein another actual characteristic includes at least softness/hardness of the physical object, and a value within the range corresponding to the softness/hardness is based on a degree of the softness/hardness.
12. The method of claim 1, further comprising:
segmenting the visual representation into a plurality of portions;
wherein the degree of the texture in a first portion of the plurality of portions and a corresponding range value based on the degree of the texture in the first portion differ from the degree of the texture in a second portion of the plurality of portions and a corresponding range value based on the degree of the texture in the second portion.
13. The method of claim 1, wherein the weights are dynamically learned.
14. A method for providing tactile feedback, comprising:
extracting a plurality of features from a visual representation of a physical object, wherein the extracting is performed using at least one image processing technique;
generating a variable frequency pattern corresponding to the extracted features;
sending the variable frequency pattern to a computing device;
assigning values within a range for each of the plurality of features, wherein the values correspond to actual characteristics of the physical object, wherein a value within the range corresponding to an actual characteristic is based on a degree of the actual characteristic;
weighting the values, wherein weights applied to the values are based on an importance of each value in connection with accurate identification of the extracted features of the physical object via the tactile feedback; and
generating the tactile feedback via the computing device in response to a stimulation applied by a user, wherein the tactile feedback is based on the variable frequency pattern.
15. The method of claim 14, wherein the weights are dynamically learned.

This application claims the benefit of U.S. Provisional Patent Application Ser. No. 61/738,005, filed on Dec. 17, 2012, the contents of which are herein incorporated by reference in their entirety.

The field generally relates to a system and method for providing the sense of touch in a computing device, and, in particular, to a system and method for dynamically generating a frequency pattern to realize the sense of touch by leveraging vibration.

Online and mobile commerce has been increasing at a rapid pace from year to year, especially with the increasing adoption of mobile devices, such as, for example, smartphones and tablets. Using, for example, mobile phone or tablet applications, mobile web sites, or a combination thereof, shoppers are able to use online stores or shopping sites to search for, compare, examine, and purchase items using their mobile devices.

Using mobile devices to purchase goods permits a consumer to make purchase decisions at any time and any place, at his or her convenience. However, shopping online or with a mobile device prevents a consumer from examining an article or physical object in person; the consumer cannot touch, feel, or try using the product, all of which can presently be done only in a store or similar physical setting where both the product and purchaser are present.

In addition, when practicing remote or tele-medicine, a doctor remains unable to feel a wound without being co-located with a patient.

Some of the restrictions of mobile and online commerce and remote medicine have been addressed. For example, high-resolution, full-color displays on higher-end devices, and increasingly on mid- and lower-level devices, make it possible to present very realistic images of an item or medical condition. Audio makes it possible to hear an object (for example, an engine or a musical instrument), or to hear a sales associate explaining an item. In addition, video makes it possible to see an item in action, or being used.

Haptic technology, or haptics, permits a user to receive tactile feedback to some degree through a device by applying forces, vibrations, and/or motions to the user. However, with current technology, the mobile and online consumer, or a remote doctor, still remains unable to accurately feel or simulate the feeling of touching a wound, scar, or a desired product, such as an article of clothing, when making a purchase.

In general, exemplary embodiments of the invention include a system and method for providing the sense of touch in a computing device, and, in particular, a system and method for dynamically generating a frequency pattern to realize the sense of touch by leveraging vibration.

According to an exemplary embodiment of the present invention, a system for providing tactile feedback comprises an image processing module capable of extracting a plurality of features from a visual representation of a physical object, wherein the extracting is performed using at least one image processing technique, and a frequency generation module capable of generating a variable frequency pattern corresponding to the extracted features, and sending the variable frequency pattern to a computing device, wherein the computing device generates the tactile feedback in response to a stimulation applied by a user, and the tactile feedback is based on the variable frequency pattern.

According to an exemplary embodiment of the present invention, a method for providing tactile feedback comprises extracting a plurality of features from a visual representation of a physical object, wherein the extracting is performed using at least one image processing technique, generating a variable frequency pattern corresponding to the extracted features, sending the variable frequency pattern to a computing device, and generating the tactile feedback via the computing device in response to a stimulation applied by a user, wherein the tactile feedback is based on the variable frequency pattern.

According to an exemplary embodiment of the present invention, an article of manufacture comprises a computer readable storage medium comprising program code tangibly embodied thereon, which when executed by a computer, performs method steps for providing tactile feedback, the method steps comprising extracting a plurality of features from a visual representation of a physical object, wherein the extracting is performed using at least one image processing technique, generating a variable frequency pattern corresponding to the extracted features, sending the variable frequency pattern to a computing device, and generating the tactile feedback via the computing device in response to a stimulation applied by a user, wherein the tactile feedback is based on the variable frequency pattern.

These and other exemplary embodiments of the invention will be described or become apparent from the following detailed description of exemplary embodiments, which is to be read in connection with the accompanying drawings.

Exemplary embodiments of the present invention will be described below in more detail, with reference to the accompanying drawings, of which:

FIG. 1 is a flow chart illustrating a method for providing tactile feedback according to an exemplary embodiment of the invention.

FIGS. 2A-2C are block diagrams illustrating image processing and frequency generation in a method for providing tactile feedback according to an exemplary embodiment of the invention.

FIG. 3 is a high-level block diagram illustrating a system for providing tactile feedback according to an exemplary embodiment of the invention.

FIG. 4 illustrates a computer system in accordance with which one or more components/steps of the techniques of the invention may be implemented, according to an exemplary embodiment of the invention.

Exemplary embodiments of the invention will now be discussed in further detail with regard to a system and method for providing the sense of touch in a computing device, and, in particular, to a system and method for dynamically generating a frequency pattern to realize the sense of touch by leveraging vibration. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein.

The embodiments of the present invention describe devices, systems and methods through which computers and mobile devices, such as, for example, smartphones and tablets, can be improved to transmit realistic haptic feelings to users. While some of the embodiments of the present invention are described in connection with mobile devices and commerce, the embodiments of the invention are not necessarily limited thereto. For example, the haptic accessory discussed herein can be applied to other types of computing devices, such as, for example, desktop and laptop computers, and personal digital assistants (PDAs), and can be applied to other fields, such as, for example, the medical field. In the case of the medical field, a doctor located remotely from a patient may be able to use the devices, systems and methods of the embodiments of the present invention to examine patients and simulate the feeling of patients' wounds or scars while not being physically present with the patient. It is to be understood that embodiments of the present invention are not limited to the retail and healthcare fields.

Embodiments of the present invention provide systems and methods to dynamically generate frequency patterns that help realize the sense of touch remotely in mobile or other computing devices, leveraging the vibration capabilities of the device to convey, for example, the texture of merchandise (e.g., the sense of touch for cloth or leather) or the nature of a wound for a given patient who is remotely located from a physician. As a result, retail stores can provide a better shopping experience to their customers by allowing consumers to shop from any part of the world without being physically present in the store, attract more customers to shop from their online retail store, and enhance mobile subscribers' experiences by enabling them to feel the merchandise. In addition, patients can be provided with remote treatments in a tele-medicine type of scenario.

Embodiments of the present invention provide systems and methods for dynamically generating frequency patterns for any given object, considering attributes such as, for example, texture, temperature, softness/hardness, stiffness, moisture content, swelling, color, and size, by applying techniques such as Digital Image Processing (DIP).

Referring to FIG. 1, which illustrates a method for providing tactile feedback to a user, according to an embodiment of the present invention, a customer/mobile subscriber navigates to an online commerce site to purchase an item, e.g., a shirt (block 102). After browsing an online catalog, the customer selects the item, e.g., the shirt, for further viewing and information (block 104). The selection may yield a visual representation of the product, which can include an image and/or video that is displayed for the user. As an alternative, an individual may be in a store, or next to a patient to be examined, take a picture of a product in the store, or of a part of the body of the patient to be examined, using a camera on, for example, a mobile device, and send the picture to a user in the place of the customer/mobile subscriber described above.

Online commerce sites may allow a customer to click on details of an item to receive further information about the product. For example, in the case of an apparel item, a customer may receive the following data: size, color, availability, price, store location, picture of the item, shipping time, etc., which is sent to the mobile or other computing device as metadata. In accordance with embodiments of the present invention, one or more features of the product (e.g., texture, temperature, softness/hardness, stiffness, moisture content, swelling, color, size, etc.) are extracted from a visual representation of the product using, for example, digital image processing (DIP), digital image capture (DIC) and/or magnetic resonance imaging (MRI) techniques (block 106). The visual representation (e.g. image) is physically segmented into portions (block 108), and based on the extracted features, a frequency pattern is mapped for each of the segments (block 110), which can vary based on the differences between the extracted features at each segment. In an alternative embodiment, a frequency pattern can be generated from an image file associated with the subject item and/or a description of the subject item. The frequency pattern is sent to the mobile device or other computing device as metadata (block 112). According to an embodiment of the present invention, the extracted features and segmentation result in time-varying data values (e.g. in the form of a varying frequency pattern) associated with each physical haptic property of the product. 
The frequency pattern metadata is not visible to the user of the mobile or other computing device, but is processed by the mobile or other computing device (block 114), through, for example, an application, and transmitted to, for example, a vibration producing component of the mobile or other computing device capable of receiving variable frequency patterns (block 116) to provide the necessary tactile feedback when the user interacts with the mobile or computing device (block 118). According to an embodiment, the mobile or other computing device responds to pressure applied thereto by, for example, the finger of a user. In accordance with embodiments of the present invention, a customer can touch a screen at the point of display of the product or other portion of the mobile or other computing device to feel a texture of the product. As a non-limiting example, used solely for purposes of illustration and not limitation, embodiments of the present invention may be used in conjunction with and/or applied to the embodiments disclosed in the U.S. patent application Ser. No. 13/864,666, which is assigned to International Business Machines Corporation of Armonk, N.Y., USA, and titled “Haptic Accessory And Methods For Using Same”, the complete disclosure of which is expressly incorporated herein by reference in its entirety for all purposes.
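The flow of blocks 106 through 118 can be sketched as follows. This is a minimal illustration, not the patent's implementation: the toy grayscale image, the stub feature extractor, and the 50-250 Hz scaling are all assumptions made purely for the example.

```python
def extract_features(segment):
    """Stand-in for DIP feature extraction: returns 0.1-1.0 range
    values per attribute (fixed here purely for illustration)."""
    return {"texture": 0.4, "softness": 0.6}

def generate_frequency_pattern(image, n_segments=3):
    """Segment the image left to right (block 108) and map each
    segment's extracted features to a frequency in Hz (block 110)."""
    width = len(image[0])
    step = width // n_segments
    pattern = []
    for i in range(n_segments):
        # restrict every pixel row to this segment's column range
        seg = [row[i * step:(i + 1) * step] for row in image]
        feats = extract_features(seg)
        # unweighted mean of feature values, scaled into 50-250 Hz
        pattern.append(50.0 + 200.0 * sum(feats.values()) / len(feats))
    return pattern

# 4x6 toy "image" of pixel intensities
image = [[10, 20, 30, 40, 50, 60] for _ in range(4)]
pattern = generate_frequency_pattern(image)  # one frequency per segment
```

The resulting per-segment pattern would then be serialized as metadata (block 112) and sent to the device for processing (blocks 114-118).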

The pattern and amount of tactile feedback is a time-varying function of the time-varying data such that the pattern and amount of tactile feedback can be sufficient to identify the physical object. For example, the tactile feedback may have a multi-dimensional time-varying pattern of vibration, such as, for example, a two-dimensional time-varying pattern of vibration. The mobile or other computing device converts the received variable frequency pattern to vibratory motion.
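One way such a conversion could work is sketched below: a single segment's frequency is rendered as time-varying vibration amplitude samples, i.e., one one-dimensional slice of the multi-dimensional pattern. The sample rate and duration are illustrative assumptions, not values from the patent.

```python
import math

def vibration_waveform(freq_hz, duration_s=0.1, sample_rate=1000):
    """Return sinusoidal amplitude samples for a vibration at freq_hz:
    a 1-D slice of the multi-dimensional time-varying pattern."""
    n = int(duration_s * sample_rate)
    return [math.sin(2 * math.pi * freq_hz * t / sample_rate)
            for t in range(n)]

samples = vibration_waveform(150.0)  # 100 amplitude samples in [-1, 1]
```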

Referring to FIGS. 2A, 2B and 2C, steps 106, 108, 110 and 112 described in connection with FIG. 1 are described in more detail in connection with a scenario where an image of a wound/medical condition is captured by a patient using a mobile device. For example, referring to block 202, a patient who desires to show his wound to a physician remote from the patient takes a picture of the wound using a mobile device, and sends the image 208 via, for example, a network, to a remote server, for example, an online application server. As shown in FIG. 3, the remote server includes retrieval module 302 having an image processing module 304 and a frequency generation module 306. According to an embodiment, the frequency generation module 306 can be a dynamic frequency pattern generator.

Referring to blocks 204, 206, 208 and 210, as per step 106, one or more features of the wound (e.g., texture, temperature, softness/hardness, stiffness, moisture content, swelling, color, size, etc.) are extracted from the visual representation of the wound 208 by the image processing module 304 using, for example, DIP, DIC and/or MRI techniques. As per step 108, referring to the diagram at 212, using image processing techniques, the visual representation (e.g. image) 208 is physically segmented into portions based on location. For example, viewing from left to right, the image of the chin wound 208 is segmented into thirds I, II and III.
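The left-to-right segmentation into thirds can be sketched on a toy image, where the visual representation is assumed to be a list of pixel rows:

```python
def segment_into_thirds(image):
    """Split a grayscale image (list of pixel rows) into three
    vertical segments I, II and III, viewed left to right."""
    width = len(image[0])
    third = width // 3
    bounds = [(0, third), (third, 2 * third), (2 * third, width)]
    # each segment keeps every row, restricted to its column range
    return [[row[lo:hi] for row in image] for lo, hi in bounds]

# 4x6 toy "image": pixel intensities 0-255
image = [[10, 20, 30, 40, 50, 60] for _ in range(4)]
seg1, seg2, seg3 = segment_into_thirds(image)  # each a 4x2 sub-image
```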

Referring to diagram 212 and Table 1 set forth below, as per block 110, the frequency generation module 306 maps a frequency pattern 213 corresponding to each of the extracted features for each of the segments I, II and III. In the diagram 212, the numbers 1-6 correspond to the numbers given to the attribute in Table 1.

TABLE 1
Attribute            Range                                           Weightage
1. Soft              Softness (0.1) ←---→ Hardness (1)               0.5
2. Colour            Pale Yellow (0.1) ←---→ Red (1)                 0.3
3. Part of the body  Head (0.1) ←---→ Toe (1)                        0.3
4. Moisture          Wet (0.1) ←---→ Dry (1)                         0.2
5. Texture           Smooth (0.1) ←---→ Rough (1)                    0.5
6. Temperature       Cold (0.1) ←---→ Warm (0.5) ←---→ Hot (1)       0.3
Referring to Table 1, which depicts a mapping of attributes to values, attributes such as softness/hardness, color, location on body, moisture, texture and temperature, are assigned values within a range of 0.1 to 1 corresponding to their actual characteristics. For example, softer, yellower, wetter, smoother and colder portions and portions closer to the head are assigned values nearer to 0.1, and portions that are harder, redder, drier, rougher, hotter, and closer to the toes are assigned values nearer to 1. As described further below, the values of the attributes can differ depending on the segment. In other words, attributes or features of the item may vary at different portions of the item and the corresponding visual representation of the item. In addition, each attribute is given a weight based on an importance in connection with accurately identifying and presenting the haptic features of the particular item. The attribute and weight values can be user-defined, dynamically learned through, for example, learning algorithms based on use and results (e.g., using historical data, prior knowledge about how tactile feedback has been presented in connection with a particular item type), and/or a combination thereof. Referring to FIG. 2C, the frequency generation module 306 takes the processed attributes for segments I, II and III (214a, b, c), and referring to block 216, determines the weightage for each of the attributes, calculates the frequency value at each segment I, II and III using the values of the attributes at each segment appropriately modified by their weightages, and maps the frequency pattern 213 for each of the segment values. Frequency pattern persistence 217 referenced in FIG. 2C is a persistence mechanism, for example, a database where the generated frequency pattern is stored so that it can be used for pattern learning processes.
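The range-value assignment and weightage-modified frequency calculation can be sketched in code. The weightages below are those of Table 1; the linear mapping into the 0.1-1.0 range and the 50-250 Hz scaling are illustrative assumptions.

```python
# Weightages from Table 1 (attribute keys are illustrative names)
WEIGHTAGES = {"soft": 0.5, "colour": 0.3, "body_part": 0.3,
              "moisture": 0.2, "texture": 0.5, "temperature": 0.3}

def to_range_value(degree, lo, hi):
    """Map a raw measured degree of a characteristic (e.g. a roughness
    score from image processing) into the 0.1-1.0 range of Table 1."""
    frac = (degree - lo) / (hi - lo)
    return 0.1 + 0.9 * max(0.0, min(1.0, frac))

def segment_frequency(values, base_hz=50.0, span_hz=200.0):
    """Combine one segment's range values with their weightages
    (a weighted mean) and scale the result into a frequency in Hz."""
    weighted = sum(values[a] * w for a, w in WEIGHTAGES.items())
    return base_hz + span_hz * weighted / sum(WEIGHTAGES.values())

# e.g., a very soft, pale, head-adjacent, wet, smooth, cold segment
values = {a: 0.1 for a in WEIGHTAGES}
freq = segment_frequency(values)  # near the low end of the Hz span
```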

As can be seen, the frequency pattern 213 varies based on the differences between the extracted features at each segment. For example, referring to diagram 212, in segment II, values for elements of the wound, such as, for example, color, hardness and texture, are different from those values in segments I and III. Accordingly, the mapped frequency at segment II is different from the mapped frequency at segments I and III. As per block 112, the frequency pattern 213 is sent to the mobile device 310 or other computing device via network 308 as metadata.
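One way the per-segment pattern might be serialized as metadata for transmission over the network (block 112) is sketched below; the JSON format and field names are hypothetical, not specified by the patent.

```python
import json

def frequency_pattern_metadata(segment_freqs):
    """Package per-segment frequencies as a metadata payload to send
    to the device alongside the visual representation."""
    return json.dumps({"type": "frequency_pattern",
                       "segments": [{"id": i, "freq_hz": f}
                                    for i, f in enumerate(segment_freqs, 1)]})

# segments I, II, III, with II differing from I and III
payload = frequency_pattern_metadata([70.0, 120.0, 70.0])
```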

Referring to FIG. 3, a system 300 for providing tactile feedback to a user, according to an embodiment of the present invention, includes a retrieval module 302, including the image processing module 304, which extracts one or more features representing characteristics of the product or wound (e.g., texture, temperature, softness/hardness, stiffness, moisture content, swelling, color, size, etc.) from a visual representation of an item (e.g., a consumer product or medical condition, such as a wound), and physically segments the visual representation (e.g. image) into portions. The visual representation can be derived from, for example, an online commerce site product offering, or a taken picture of a product or a part of the body of a patient that is uploaded to the retrieval module 302 and sent to a user of the mobile device 310.

The retrieval module 302 also includes a frequency generation module 306, which, based on the extracted features, maps a frequency pattern corresponding to each of the segments. The retrieval module 302 sends the frequency pattern to the mobile device 310 or other computing device as metadata, and sends the visual representation to the mobile device 310. According to an embodiment, the retrieval module 302 is located at a remote server, such as an online application server, accessible via a network.

According to an embodiment of the present invention, the extraction and segmentation are performed using DIP, digital image capture (DIC) and/or magnetic resonance imaging (MRI) techniques, and the mapping is performed by a dynamic frequency generator. The mapped variable frequency pattern includes time-varying data values associated with each physical haptic property of the product. The frequency pattern metadata is sent to the mobile device 310 via a network 308, such as a local area network (LAN), wide area network (WAN), cellular network, satellite network or the Internet, and is processed by a frequency processing module 312 which is electrically connected to and/or incorporated into the mobile device 310. According to embodiments, communication between the retrieval module 302 and the mobile device 310 can be via, for example, multimedia messaging and/or based on Wi-Fi, BLUETOOTH, GSM/CDMA, 2G, 3G, 4G, etc. technologies. Based on the frequency pattern received from the frequency generation module 306, the mobile device 310 or other computing device is able to generate the appropriate wave-like vibration patterns and/or control temperature so that the mobile device 310 or other computing device provides the necessary tactile feedback when the user interacts with the mobile device 310 or other computing device, or an accessory electrically connected thereto. According to an embodiment, the wave-like vibration pattern corresponds to the frequency of each segment so that a user is able to feel the differences in the attributes at each respective portion of the item.
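On the device side, selecting which segment's frequency to play for a given touch position could work as sketched below, assuming segments span the screen left to right; the function name and parameters are illustrative.

```python
def frequency_at_touch(x, screen_width, segment_freqs):
    """Pick the vibration frequency for a touch at horizontal pixel
    position x, with segments laid out left to right across the
    screen; x at the far right edge is clamped to the last segment."""
    idx = min(int(x * len(segment_freqs) / screen_width),
              len(segment_freqs) - 1)
    return segment_freqs[idx]

# three per-segment frequencies spanning a 300-pixel-wide display
freqs = [70.0, 120.0, 70.0]
left = frequency_at_touch(10, 300, freqs)    # segment I
middle = frequency_at_touch(150, 300, freqs)  # segment II
```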

As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, apparatus, method, or computer program product. Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module” or “system.” Furthermore, aspects of the present invention may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon.

Any combination of one or more computer readable medium(s) may be utilized. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.

A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.

Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.

Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the “C” programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).

Aspects of the present invention are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.

These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.

The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.

FIGS. 1, 2A-2C and 3 illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present invention. In this regard, each block in a flowchart or a block diagram may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagram and/or flowchart illustration, and combinations of blocks in the block diagram and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.

One or more embodiments can make use of software running on a general-purpose computer or workstation. With reference to FIG. 4, in a computing node 410 there is a computer system/server 412, which is operational with numerous other general purpose or special purpose computing system environments or configurations. Examples of well-known computing systems, environments, and/or configurations that may be suitable for use with computer system/server 412 include, but are not limited to, personal computer systems, server computer systems, thin clients, thick clients, handheld or laptop devices, multiprocessor systems, microprocessor-based systems, set top boxes, programmable consumer electronics, network PCs, minicomputer systems, mainframe computer systems, and distributed cloud computing environments that include any of the above systems or devices, and the like.

Computer system/server 412 may be described in the general context of computer system executable instructions, such as program modules, being executed by a computer system. Generally, program modules may include routines, programs, objects, components, logic, data structures, and so on that perform particular tasks or implement particular abstract data types. Computer system/server 412 may be practiced in distributed cloud computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed cloud computing environment, program modules may be located in both local and remote computer system storage media including memory storage devices.

As shown in FIG. 4, computer system/server 412 in computing node 410 is shown in the form of a general-purpose computing device. The components of computer system/server 412 may include, but are not limited to, one or more processors or processing units 416, a system memory 428, and a bus 418 that couples various system components including system memory 428 to processor 416.

The bus 418 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, and not limitation, such architectures include Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnects (PCI) bus.

The computer system/server 412 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by computer system/server 412, and it includes both volatile and non-volatile media, removable and non-removable media.

The system memory 428 can include computer system readable media in the form of volatile memory, such as random access memory (RAM) 430 and/or cache memory 432. The computer system/server 412 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 434 can be provided for reading from and writing to a non-removable, non-volatile magnetic media (not shown and typically called a “hard drive”). Although not shown, a magnetic disk drive for reading from and writing to a removable, non-volatile magnetic disk (e.g., a “floppy disk”), and an optical disk drive for reading from or writing to a removable, non-volatile optical disk such as a CD-ROM, DVD-ROM or other optical media can be provided. In such instances, each can be connected to the bus 418 by one or more data media interfaces.

As depicted and described herein, the memory 428 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the invention. A program/utility 440, having a set (at least one) of program modules 442, may be stored in memory 428 by way of example, and not limitation, as well as an operating system, one or more application programs, other program modules, and program data. Each of the operating system, one or more application programs, other program modules, and program data or some combination thereof, may include an implementation of a networking environment. Program modules 442 generally carry out the functions and/or methodologies of embodiments of the invention as described herein.
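As one hypothetical sketch of such a program module 442, the function names, value ranges, and the linear frequency mapping below are illustrative assumptions, loosely drawn from the claimed steps of assigning values within a range for each feature and weighting them by importance before generating a variable frequency pattern:

```python
# Hypothetical sketch of one program module 442 (names are illustrative):
# map weighted per-feature values into a single vibration frequency.
def variable_frequency_pattern(values, weights, f_min=50.0, f_max=300.0):
    """values: per-feature scores in [0, 1], each reflecting the degree
    of an actual characteristic of the physical object;
    weights: the relative importance of each value for identification."""
    # weighted average of the feature scores, normalized to [0, 1]
    score = sum(w * v for w, v in zip(weights, values)) / sum(weights)
    # linearly map the weighted score into an assumed frequency band (Hz)
    return f_min + score * (f_max - f_min)

freq = variable_frequency_pattern([0.2, 0.8], [1.0, 3.0])
```

The [0, 1] range and the 50-300 Hz band are placeholders; the patent specifies neither, only that values reflect the degree of a characteristic and that weights reflect each value's importance for accurate identification.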

Computer system/server 412 may also communicate with one or more external devices 414 such as a keyboard, a pointing device, a display 424, etc., one or more devices that enable a user to interact with computer system/server 412, and/or any devices (e.g., network card, modem, etc.) that enable computer system/server 412 to communicate with one or more other computing devices. Such communication can occur via Input/Output (I/O) interfaces 422. Still yet, computer system/server 412 can communicate with one or more networks such as a local area network (LAN), a general wide area network (WAN), and/or a public network (e.g., the Internet) via network adapter 420. As depicted, network adapter 420 communicates with the other components of computer system/server 412 via bus 418. It should be understood that, although not shown, other hardware and/or software components could be used in conjunction with computer system/server 412. Examples include, but are not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data archival storage systems.

Although illustrative embodiments of the present invention have been described herein with reference to the accompanying drawings, it is to be understood that the invention is not limited to those precise embodiments, and that various other changes and modifications may be made by one skilled in the art without departing from the scope or spirit of the invention.

Shanmugam, Dhandapani, Mohammed, Siddique

Patent Priority Assignee Title
10765929, Nov 12 2013 LNW GAMING, INC Reconfigurable playing card devices and related systems and methods
9218565, Dec 18 2013 International Business Machines Corporation Haptic-based artificial neural network training
9230208, Dec 18 2013 International Business Machines Corporation Haptic-based artificial neural network training
9530092, Dec 18 2013 International Business Machines Corporation Haptic-based artificial neural network training
9820186, Apr 07 2014 LG INNOTEK CO , LTD Communication module and data segmentation transmission method using the same
Patent Priority Assignee Title
5165897, Aug 10 1990 TiNi Alloy Company Programmable tactile stimulator array system and method of operation
5574576, May 21 1993 Coherent light driven display device
5984880, Jan 20 1998 Tactile feedback controlled by various medium
6402620, Dec 02 1998 TECHNOLOGY CREATIONS, INC Amplified stereo sound and force feed back accessory for video game devices
6424333, Nov 30 1995 Immersion Corporation Tactile feedback man-machine interface device
7046151, Jul 14 2003 Michael J., Dundon Interactive body suit and interactive limb covers
7227968, Jun 24 2002 SONION ROSKILDE A S Expandsible Receiver Module
7562816, Dec 18 2006 Toshiba Global Commerce Solutions Holdings Corporation Integrating touch, taste, and/or scent with a visual interface of an automated system for an enhanced user experience
7723896, Jan 20 2004 Japan Science and Technology Agency; Esashi, Masayoshi; Haga, Yoichi Driving mechanism using shape memory alloys including a magnetic latch
8040345, Nov 30 2006 3D Systems, Inc Systems for hybrid geometric/volumetric representation of 3D objects
8159461, Nov 01 2001 Immersion Corporation Method and apparatus for providing tactile sensations
8174372, Jun 26 2008 Immersion Corporation Providing haptic feedback on a touch surface
8201850, Apr 26 2008 GM Global Technology Operations LLC Adjustable belt tensioning utilizing active material actuation
8207832, Jun 25 2008 LG Electronics Inc. Haptic effect provisioning for a mobile communication terminal
8244211, Feb 07 2008 INFLEXIS, LLC Mobile electronic security apparatus and method
8253703, Mar 03 2009 VIBRANT LICENSING LLC Elastomeric wave tactile interface
8264465, Oct 08 2004 Immersion Corporation Haptic feedback for button and scrolling action simulation in touch input devices
8274376, Nov 17 2008 Canyon Ridge Resources, L.L.C. System and method for wireless control of medical devices
8369887, Jul 01 2008 LG Electronics Inc. Mobile terminal using proximity sensor and method of controlling the mobile terminal
8427433, Oct 17 2008 ADEMCO INC Tactile-feedback touch screen
8866600, Jan 20 2011 HTC Corporation Electronic apparatus with haptic feedback and method for providing haptic feedback
20020046020
20020094755
20020158836
20020191011
20050097179
20050275567
20060274036
20080274769
20080303782
20100073328
20100141407
20100149092
20100231541
20100311489
20100311490
20110001616
20110152729
20110254671
20120028577
20120162117
20120206248
20120218091
20120220311
20120237043
20120242687
20130100046
20130201115
20130201141
20130335354
20140002376
20140168114
EP1521434
JP10287189
JP2002232317
WO3014907
Executed on | Assignor | Assignee | Conveyance | Reel/Frame
Apr 01 2013 | | International Business Machines Corporation | (assignment on the face of the patent) |
Apr 01 2013 | MOHAMMED, SIDDIQUE | International Business Machines Corporation | ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS) | 030121/0754 (pdf)
Apr 01 2013 | SHANMUGAM, DHANDAPANI | International Business Machines Corporation | ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS) | 030121/0754 (pdf)
Date Maintenance Fee Events
Jan 21 2019 REM: Maintenance Fee Reminder Mailed.
Jul 08 2019 EXP: Patent Expired for Failure to Pay Maintenance Fees.
Aug 07 2019 PTGR: Petition Related to Maintenance Fees Granted.


Date Maintenance Schedule
Jun 02 2018: 4 years fee payment window open
Dec 02 2018: 6 months grace period start (with surcharge)
Jun 02 2019: patent expiry (for year 4)
Jun 02 2021: 2 years to revive unintentionally abandoned end (for year 4)
Jun 02 2022: 8 years fee payment window open
Dec 02 2022: 6 months grace period start (with surcharge)
Jun 02 2023: patent expiry (for year 8)
Jun 02 2025: 2 years to revive unintentionally abandoned end (for year 8)
Jun 02 2026: 12 years fee payment window open
Dec 02 2026: 6 months grace period start (with surcharge)
Jun 02 2027: patent expiry (for year 12)
Jun 02 2029: 2 years to revive unintentionally abandoned end (for year 12)