In a data space in which a coordinate system is set having, as components, feature data extracted by approximation with respect to frequency spectrums of received ultrasonic waves, when a feature point of a frequency spectrum at a data position is present within a first area in an i-th frame (i is a positive integer) in a display and moves closer to a second area, having a lower priority than the first area, in the subsequent (i+1)-th frame, an ultrasonic observation apparatus generates image data by setting a virtual feature point that is farther from the second area than the latest feature point and that is within or around the first area, and by replacing visual information corresponding to the latest feature point at the data position in the (i+1)-th frame with visual information corresponding to the virtual feature point, and then displays the image data.
1. An ultrasonic observation apparatus that transmits ultrasonic sound waves to a specimen and receives ultrasonic sound waves reflected from the specimen, the ultrasonic observation apparatus comprising:
a frequency analyzing unit that calculates frequency spectrums at a plurality of data positions which are set with respect to the ultrasonic sound waves that have been received;
a feature data extracting unit that performs approximation with respect to the frequency spectrums calculated by the frequency analyzing unit and extracts a single set or a plurality of sets of feature data of the frequency spectrums;
an image processing unit that sequentially generates feature-data image data containing information related to the feature data extracted by the feature data extracting unit; and
a display unit that sequentially displays images corresponding to the feature-data image data generated sequentially by the image processing unit, wherein
in a feature data space in which it is possible to set a coordinate system having at least some of the single set or the plurality of sets of feature data as coordinate components, when a feature point of a frequency spectrum at a particular data position is present within a predetermined first type area in an i-th frame (where i is a positive integer) in the display unit and moves closer to a second type area, which has a lower priority for image display than the first type area, in the subsequent (i+1)-th frame,
the image processing unit generates the feature-data image data by setting a virtual feature point at a position that is farther from the second type area than the position of the latest feature point and that is within or around the first type area, and by replacing visual information corresponding to the latest feature point at the particular data position in the (i+1)-th frame with visual information corresponding to the virtual feature point.
2. The ultrasonic observation apparatus according to
3. The ultrasonic observation apparatus according to
the first representative point is a feature point determined from the feature data extracted by the feature data extracting unit, and
the second representative point is the virtual feature point.
4. The ultrasonic observation apparatus according to
the image processing unit determines the positional relationship of the feature point that is used in generating the (i+1)-th frame with the second type area while considering, as a reference boundary, a straight line or a plane that is orthogonal to the reference axis passing through representative points of the first type area and the second type area and that passes through either one of a third representative point and a fourth representative point calculated as feature points at the same data position,
the third representative point is a feature point determined from the feature data extracted by the feature data extracting unit, and
the fourth representative point is the virtual feature point.
5. The ultrasonic observation apparatus according to
6. The ultrasonic observation apparatus according to
7. The ultrasonic observation apparatus according to
8. The ultrasonic observation apparatus according to
an approximating unit that performs the approximation with respect to the frequency spectrums calculated by the frequency analyzing unit and extracts pre-correction feature data as feature data prior to performing the attenuation correction; and
an attenuation correcting unit that performs the attenuation correction with respect to the pre-correction feature data extracted by the approximating unit, and extracts feature data of the frequency spectrums.
9. The ultrasonic observation apparatus according to
10. The ultrasonic observation apparatus according to
11. The ultrasonic observation apparatus according to
an attenuation correcting unit that performs the attenuation correction with respect to the frequency spectrums; and
an approximating unit that performs the approximation with respect to the frequency spectrums corrected by the attenuation correcting unit, and extracts feature data of the frequency spectrums.
12. The ultrasonic observation apparatus according to
13. The ultrasonic observation apparatus according to
14. An operation method of an ultrasonic observation apparatus that transmits ultrasonic sound waves to a specimen and receives ultrasonic sound waves reflected from the specimen, the operation method comprising:
calculating, by a frequency analyzing unit, a frequency spectrum by analyzing frequencies of the received ultrasonic sound waves;
extracting a single set or a plurality of sets of feature data of the frequency spectrum by performing approximation with respect to the frequency spectrum;
generating, by an image processing unit, feature-data image data containing information related to the feature data; and
displaying, by a display unit, an image corresponding to the feature-data image data, wherein
the operations from the calculating of the frequency spectrum to the displaying of the image are performed in a repeated manner, and
in a feature data space in which it is possible to set a coordinate system having at least some of the single set or the plurality of sets of feature data as coordinate components, when a feature point of a frequency spectrum at a particular data position is present within a predetermined first type area in an i-th frame (where i is a positive integer) in the display unit and moves closer to a second type area, which has a lower priority for image display than the first type area, in the subsequent (i+1)-th frame, the feature-data image data is generated by setting a virtual feature point at a position that is farther from the second type area than the position of the latest feature point and that is within or around the first type area, and by replacing visual information corresponding to the latest feature point at the particular data position in the (i+1)-th frame with visual information corresponding to the virtual feature point.
15. A non-transitory computer readable recording medium with an executable program stored thereon, wherein the program instructs a processor to perform:
calculating, by a frequency analyzing unit, a frequency spectrum by analyzing frequencies of received ultrasonic sound waves;
extracting a single set or a plurality of sets of feature data of the frequency spectrum by performing approximation with respect to the frequency spectrum;
generating, by an image processing unit, feature-data image data containing information related to the feature data; and
displaying, by a display unit, an image corresponding to the feature-data image data, wherein
the operations from the calculating of the frequency spectrum to the displaying of the image are performed in a repeated manner, and
in a feature data space in which it is possible to set a coordinate system having at least some of the single set or the plurality of sets of feature data as coordinate components, when a feature point of a frequency spectrum at a particular data position is present within a predetermined first type area in an i-th frame (where i is a positive integer) in a display unit and moves closer to a second type area, which has a lower priority for image display than the first type area, in the subsequent (i+1)-th frame, the feature-data image data is generated by setting a virtual feature point at a position that is farther from the second type area than the position of the latest feature point and that is within or around the first type area, and by replacing visual information corresponding to the latest feature point at the particular data position in the (i+1)-th frame with visual information corresponding to the virtual feature point.
This application is a continuation of PCT International Application No. PCT/JP2011/076605, filed on Nov. 11, 2011, which designates the United States and is incorporated herein by reference, and which claims the benefit of priority from Japanese Patent Application No. 2010-253291, filed on Nov. 11, 2010, incorporated herein by reference.
1. Field of the Invention
The present invention relates to an ultrasonic observation apparatus, an operation method of the ultrasonic observation apparatus, and a computer readable recording medium for enabling observation of tissues of a specimen using ultrasonic sound waves.
2. Description of the Related Art
Typically, ultrasonic elastography is known as a technology for performing screening for breast cancer using ultrasonic sound waves (see, for example, International Laid-open Pamphlet No. 2005/122906). Ultrasonic elastography makes use of the fact that cancer tissues or tumor tissues inside a body differ in hardness depending on the disease progression or on the nature of the body. In this technology, while external compression is continually applied to the screening location, the strain amount or the degree of elasticity of the body tissues at the screening location is measured using ultrasonic sound waves, and the measurement result is displayed in the form of cross-sectional images.
An ultrasonic observation apparatus according to the present invention transmits ultrasonic sound waves to a specimen and receives ultrasonic sound waves reflected from the specimen, the ultrasonic observation apparatus comprising: a frequency analyzing unit that calculates frequency spectrums at a plurality of data positions which are set with respect to the ultrasonic sound waves that have been received; a feature data extracting unit that performs approximation with respect to the frequency spectrums calculated by the frequency analyzing unit and extracts a single set or a plurality of sets of feature data of the frequency spectrums; an image processing unit that sequentially generates feature-data image data containing information related to the feature data extracted by the feature data extracting unit; and a display unit that sequentially displays images corresponding to the feature-data image data generated sequentially by the image processing unit, wherein in a feature data space in which it is possible to set a coordinate system having at least some of the single set or the plurality of sets of feature data as coordinate components, when a feature point of a frequency spectrum at a particular data position is present within a predetermined first type area in an i-th frame (where i is a positive integer) in the display unit and moves closer to a second type area, which has a lower priority for image display than the first type area, in the subsequent (i+1)-th frame, the image processing unit generates the feature-data image data by setting a virtual feature point at a position that is farther from the second type area than the position of the latest feature point and that is within or around the first type area, and by replacing visual information corresponding to the latest feature point at the particular data position in the (i+1)-th frame with visual information corresponding to the virtual feature point.
An operation method according to the present invention is a method of operating an ultrasonic observation apparatus that transmits ultrasonic sound waves to a specimen and receives ultrasonic sound waves reflected from the specimen, the operation method comprising: calculating, by a frequency analyzing unit, a frequency spectrum by analyzing frequencies of the received ultrasonic sound waves; extracting a single set or a plurality of sets of feature data of the frequency spectrum by performing approximation with respect to the frequency spectrum; generating, by an image processing unit, feature-data image data containing information related to the feature data; and displaying, by a display unit, an image corresponding to the feature-data image data, wherein the operations from the calculating of the frequency spectrum to the displaying of the image are performed in a repeated manner, and in a feature data space in which it is possible to set a coordinate system having at least some of the single set or the plurality of sets of feature data as coordinate components, when a feature point of a frequency spectrum at a particular data position is present within a predetermined first type area in an i-th frame (where i is a positive integer) in the display unit and moves closer to a second type area, which has a lower priority for image display than the first type area, in the subsequent (i+1)-th frame, the feature-data image data is generated by setting a virtual feature point at a position that is farther from the second type area than the position of the latest feature point and that is within or around the first type area, and by replacing visual information corresponding to the latest feature point at the particular data position in the (i+1)-th frame with visual information corresponding to the virtual feature point.
A non-transitory computer readable recording medium according to the present invention has an executable program stored thereon, wherein the program instructs a processor to perform: calculating, by a frequency analyzing unit, a frequency spectrum by analyzing frequencies of received ultrasonic sound waves; extracting a single set or a plurality of sets of feature data of the frequency spectrum by performing approximation with respect to the frequency spectrum; generating, by an image processing unit, feature-data image data containing information related to the feature data; and displaying, by a display unit, an image corresponding to the feature-data image data, wherein the operations from the calculating of the frequency spectrum to the displaying of the image are performed in a repeated manner, and in a feature data space in which it is possible to set a coordinate system having at least some of the single set or the plurality of sets of feature data as coordinate components, when a feature point of a frequency spectrum at a particular data position is present within a predetermined first type area in an i-th frame (where i is a positive integer) in a display unit and moves closer to a second type area, which has a lower priority for image display than the first type area, in the subsequent (i+1)-th frame, the feature-data image data is generated by setting a virtual feature point at a position that is farther from the second type area than the position of the latest feature point and that is within or around the first type area, and by replacing visual information corresponding to the latest feature point at the particular data position in the (i+1)-th frame with visual information corresponding to the virtual feature point.
The above and other features, advantages and technical and industrial significance of this invention will be better understood by reading the following detailed description of presently preferred embodiments of the invention, when considered in connection with the accompanying drawings.
The patent or application file contains at least one drawing executed in color. Copies of this patent or patent application publication with color drawing(s) will be provided by the Office upon request and payment of the necessary fee.
Exemplary illustrative embodiments of the present invention (hereinafter, referred to as “embodiments”) are explained below in detail with reference to the accompanying drawings.
The ultrasonic observation apparatus 1 includes an ultrasonic probe 2 that outputs an ultrasonic pulse to the outside and receives an ultrasonic echo obtained by reflection on the outside; a transmitting-receiving unit 3 that transmits electrical signals to and receives electrical signals from the ultrasonic probe 2; a processing unit 4 that performs predetermined processing on electrical echo signals which are obtained by means of conversion of the ultrasonic echo; an image processing unit 5 that generates a variety of image data using the electrical echo signals which are obtained by means of conversion of the ultrasonic echo; an input unit 6 that is configured with an interface such as a keyboard, a mouse, or a touch-sensitive panel, and that receives input of a variety of information; a display unit 7 that is configured with a liquid crystal display panel or an organic EL display panel, and that displays a variety of information including the images generated by the image processing unit 5; a memory unit 8 that is used to store a variety of information including information related to the tissue characterizations of known specimens; and a control unit 9 that controls the operations of the ultrasonic observation apparatus 1.
The ultrasonic probe 2 includes a signal converting unit 21 that converts electrical pulse signals received from the transmitting-receiving unit 3 into ultrasonic pulses (acoustic pulse signals) and converts the ultrasonic echo obtained by reflection from an outside specimen into electrical echo signals. Meanwhile, the ultrasonic probe 2 can be configured to have an ultrasonic transducer that performs scanning in a mechanical manner or to have a plurality of ultrasonic transducers that perform scanning in an electronic manner.
The transmitting-receiving unit 3 is electrically connected to the ultrasonic probe 2. With that, the transmitting-receiving unit 3 transmits pulse signals to the ultrasonic probe 2 and receives echo signals from the ultrasonic probe 2. More particularly, based on a predetermined waveform and a predetermined transmission timing, the transmitting-receiving unit 3 generates pulse signals and transmits those pulse signals to the ultrasonic probe 2. Moreover, the transmitting-receiving unit 3 performs operations such as amplification and filtering on received echo signals, performs A/D conversion of those echo signals to generate digital RF signals, and outputs those digital RF signals. Meanwhile, when the ultrasonic probe 2 is configured to have a plurality of ultrasonic transducers performing scanning in an electronic manner, the transmitting-receiving unit 3 is configured to include a multichannel circuit for performing beam synthesis corresponding to the ultrasonic transducers.
The processing unit 4 includes a frequency analyzing unit 41 that performs frequency analysis of echo signals by carrying out fast Fourier transformation (FFT) of the digital RF signals output by the transmitting-receiving unit 3; and a feature data extracting unit 42 that extracts feature data of the specimen by performing attenuation correction and approximation with respect to the frequency spectrum calculated by the frequency analyzing unit 41, so as to reduce the contribution of the attenuation that the ultrasonic sound waves undergo, depending on the reception depth and the frequency, while propagating.
The frequency analyzing unit 41 calculates a frequency spectrum with respect to each acoustic ray (line data) by performing fast Fourier transformation of an FFT data group having a predetermined volume of data. Depending on the tissue characterization of the specimen, the frequency spectrum demonstrates a different tendency. That is because a frequency spectrum has a correlation with the size, the density, and the acoustic impedance of the specimen that serves as a scatterer which scatters the ultrasonic sound waves. Herein, for example, "tissue characterization" points to any one of a cancer, an endocrine tumor, a mucinous tumor, a normal tissue, and a vascular channel. If the specimen is a pancreas, then chronic pancreatitis and autoimmune pancreatitis are also considered as tissue characterizations.
The feature data extracting unit 42 further includes an approximating unit 421, which performs approximation with respect to the frequency spectrum calculated by the frequency analyzing unit 41 and calculates pre-correction feature data that is the feature data prior to performing attenuation correction; and includes an attenuation correcting unit 422, which extracts feature data by performing attenuation correction with respect to the pre-correction feature data obtained by approximation by the approximating unit 421.
The approximating unit 421 performs linear approximation with respect to the frequency spectrum by means of regression analysis so as to extract feature data that characterizes the approximated linear expression. More particularly, by means of regression analysis, the approximating unit 421 calculates a gradient a0 and an intercept b0 of the linear expression, and also calculates the intensity at a specific frequency within the frequency band of the frequency spectrum as the pre-correction feature data. In the first embodiment, it is assumed that the approximating unit 421 calculates c0=a0fMID+b0 as the intensity (Mid-band fit) at the central frequency fMID=(fLOW+fHIGH)/2. However, that is only one example. Herein, the intensity points to any one of parameters such as voltage, power, acoustic pressure, and acoustic energy.
Of the three components of feature data, the gradient a0 has a correlation with the size of the scatterer that scatters the ultrasonic sound waves. Generally, it is thought that the larger the scatterer, the smaller the value of the gradient. The intercept b0 has a correlation with the size of the scatterer, the difference in acoustic impedances, and the density (concentration) of the scatterer. More particularly, it is thought that the larger the scatterer, the greater the value of the intercept b0; the greater the acoustic impedance, the greater the value of the intercept b0; and the greater the density (concentration) of the scatterer, the greater the value of the intercept b0. The intensity c0 at the central frequency fMID (hereinafter, simply referred to as "intensity c0") is an indirect parameter derived from the gradient a0 and the intercept b0, and represents the spectrum intensity at the center of the valid frequency band. Thus, it is thought that the intensity c0 has a correlation not only with the size of the scatterer, the difference in acoustic impedances, and the density of the scatterer, but also with the luminosity values of B-mode images to a certain extent. Meanwhile, the approximation polynomial calculated by the feature data extracting unit 42 is not limited to a linear expression. Alternatively, it is also possible to use an approximation polynomial of second order or higher.
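As a concrete illustration of the approximation described above, the following is a minimal Python sketch (not part of the original disclosure); the function name, the assumption that the spectrum arrives as numpy arrays of frequencies (MHz) and intensities (dB), and the use of a least-squares fit are one plausible reading of the text.

```python
import numpy as np

def extract_pre_correction_features(freqs_mhz, intensity_db, f_low, f_high):
    """Approximate the frequency spectrum by a straight line via regression
    analysis and return the gradient a0, the intercept b0, and the Mid-band
    fit c0 = a0 * fMID + b0 (the intensity at the central frequency)."""
    band = (freqs_mhz >= f_low) & (freqs_mhz <= f_high)
    a0, b0 = np.polyfit(freqs_mhz[band], intensity_db[band], 1)  # linear regression
    f_mid = (f_low + f_high) / 2.0
    c0 = a0 * f_mid + b0
    return a0, b0, c0
```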
The following explanation is given for the correction performed by the attenuation correcting unit 422. An attenuation amount A of ultrasonic sound waves can be expressed as:
A=2αzf (1)
where, α represents the attenuation rate, z represents the reception depth of ultrasonic sound waves, and f represents the frequency. As is clear from Equation (1), the attenuation amount A is proportional to the frequency f. Regarding a living body, the specific value of the attenuation rate α is in the range of 0 to 1.0 (dB/cm/MHz) and desirably is in the range of 0.3 to 0.7 (dB/cm/MHz), and is determined according to the organ to be observed. For example, if the organ to be observed is pancreas, then the attenuation rate α is set to 0.6 (dB/cm/MHz). Meanwhile, in the first embodiment, the configuration can also be such that the value of the attenuation rate α can be modified by an input from the input unit 6.
The attenuation correcting unit 422 corrects the pre-correction feature data (the gradient a0, the intercept b0, and the intensity c0), which has been calculated by the approximating unit 421, in the following manner:
a=a0+2αz (2)
b=b0 (3)
c=c0+2αzfMID(=afMID+b) (4)
As is clear from Equations (2) and (4), the greater the reception depth of the ultrasonic sound waves, the greater the amount of correction performed by the attenuation correcting unit 422. Meanwhile, with reference to Equation (3), the correction related to the intercept is an identity transformation. That is because the intercept is the frequency component corresponding to the frequency 0 (Hz) and thus does not get attenuated.
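Expressed as code, Equations (2) to (4) amount to the following sketch (assuming α in dB/cm/MHz, z in cm, and fMID in MHz, consistent with the units given above; the function name is ours):

```python
def attenuation_correct(a0, b0, c0, alpha, z, f_mid):
    """Apply Equations (2)-(4) to the pre-correction feature data."""
    a = a0 + 2.0 * alpha * z           # Equation (2): deeper positions get a larger correction
    b = b0                             # Equation (3): the intercept (frequency 0) is unattenuated
    c = c0 + 2.0 * alpha * z * f_mid   # Equation (4): equals a * f_mid + b
    return a, b, c
```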
The image processing unit 5 includes a B-mode image data generating unit 51 that generates B-mode image data from echo signals; and includes a feature-data image data generating unit 52 that generates feature-data image data containing information related to feature data.
The B-mode image data generating unit 51 generates B-mode image data by performing signal processing on digital signals using known technologies such as bandpass filtering, logarithmic conversion, gain processing, and contrast processing, and by performing data thinning according to the data step width that is decided in accordance with the display range of images in the display unit 7.
The feature-data image data generating unit 52 generates, in a temporally continuous manner, feature-data image data by referring to the B-mode image data generated by the B-mode image data generating unit 51 and to the feature data extracted by the feature data extracting unit 42. More particularly, in a feature data space in which it is possible to set a coordinate system having at least some of the feature data extracted by the feature data extracting unit 42 as coordinate components, when a feature point of a frequency spectrum at a particular data position moves with the passage of time from a predetermined first type area closer to a second type area having a lower priority for image display than the first type area, the feature-data image data generating unit 52 sets a virtual feature point at a position that is farther from the second type area than the position of the latest feature point and that is within or around the first type area, and assigns visual information corresponding to the virtual feature point to the same data position, so as to generate in a continuous manner the feature-data image data that contains information related to feature data. Herein, it is desirable that the first type area is set as an area corresponding to a tissue characterization of high importance, such as cancer, that should get detected during the observation.
In the first embodiment, with respect to the first type area, a virtual feature point is set as described above for the purpose of displaying a residual image. However, with respect to the second type area, no virtual feature point is set even in the case when a feature point of a frequency spectrum at a particular data position moves away from the second type area with the passage of time. In other words, the feature-data image data generating unit 52 performs residual image processing only with respect to the feature points present in the first type area in the feature data space.
The memory unit 8 includes a known-specimen information storing unit 81 that is used to store known specimen information including the feature data of known specimens; includes a window function storing unit 82 that is used to store a window function used during frequency analysis performed by the frequency analyzing unit 41; includes a correction information storing unit 83 that is used to store correction information which is referred to by the attenuation correcting unit 422 while performing attenuation correction; a feature-data-space information storing unit 84 that is used to store information related to feature data space which is set on the basis of the feature data of known specimens stored in the known-specimen information storing unit 81; and a feature data information storing unit 85 that is used to store information related to the feature data that is calculated as coordinate values of points in the feature data space which is stored in the feature-data-space information storing unit 84.
The known-specimen information storing unit 81 is used to store the feature data of frequency spectrums extracted for known specimens and the tissue characterizations of those known specimens in a corresponding manner. Herein, it is assumed that the feature data of a known specimen is extracted by an operation identical to that explained in the first embodiment. However, the feature data extracting operation for a known specimen need not be performed in the ultrasonic observation apparatus 1. Meanwhile, with respect to feature data of the frequency spectrum related to a known specimen, the known-specimen information storing unit 81 also stores the average and the standard deviation calculated for each group, which is classified on the basis of information including the tissue characterization of that known specimen, along with all feature data of that known specimen. In the first embodiment, the average and the standard deviation of feature data of a frequency spectrum of ultrasonic reception signals reflect changes at a cellular level, such as enlargement or anomaly of the nucleus in the specimen, or tissue-level changes, such as fibrotic growth in the interstitium or substitution of parenchymal tissues with fibers. Since these averages and standard deviations indicate values unique to each tissue characterization, they are used to classify tissue characterizations.
The window function storing unit 82 is used to store at least one window function of the window functions such as Hamming, Hanning, and Blackman. The correction information storing unit 83 is used to store the information related to the conversion of Equations (2) to (4).
The feature-data-space information storing unit 84 is used to store a plurality of groups, which are obtained by classification on the basis of the feature data of a plurality of known specimens, and to store a representative point of each group as the information related to the feature data space that is set on the basis of the known specimen information stored in the known-specimen information storing unit 81. For example, a representative point can either be the average of feature data in the corresponding group or be the median point of feature data in the corresponding group.
The feature data information storing unit 85 stores therein feature points that are used at the time of calculating pixel values as visual information. Such feature points include the feature points extracted by the feature data extracting unit 42 as well as residual image feature points, which are virtual feature points determined according to the positions of feature points. Moreover, the feature data information storing unit 85 also stores, in a predetermined memory area thereof, a residual image area flag that takes different values depending on whether or not the feature point used at the time of calculating a pixel value is present in the residual image area. For example, when either a feature point present in the residual image area or a residual image feature point is used in calculating a pixel value, the feature data information storing unit 85 stores "1" as the value of the residual image area flag. On the other hand, when a feature point present outside the residual image area is used in calculating a pixel value, the feature data information storing unit 85 stores "0" as the value of the residual image area flag.
Moreover, as the information about pixel values, that is, the visual information determined on a pixel-by-pixel basis, the feature-data-space information storing unit 84 stores therein the relationship between points in the feature data space and pixel values. For example, the feature-data-space information storing unit 84 stores therein the values of variables that constitute a color space and that are assigned to the intercept b and the intensity c. Herein, the color space points to a color system that expresses colors with variables representing the three attributes of light (hue, luminosity, and color intensity), such as the RGB color system or a complementary color system.
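For instance, one hypothetical assignment of the corrected intercept b and intensity c to two channels of an RGB pixel could look like the sketch below; the channel choice and the normalization ranges are illustrative assumptions, since the text only states that color-space variables are assigned to b and c.

```python
def feature_point_to_rgb(b, c, b_min, b_max, c_min, c_max):
    """Map a feature point (b, c) in the feature data space to an RGB pixel."""
    def to_byte(v, lo, hi):
        return int(round(255 * min(max((v - lo) / (hi - lo), 0.0), 1.0)))
    # Assumed mapping: intercept b -> red channel, intensity c -> blue channel.
    return (to_byte(b, b_min, b_max), 0, to_byte(c, c_min, c_max))
```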
Meanwhile, the memory unit 8 is implemented using a ROM, in which the operating programs of the ultrasonic observation apparatus 1 according to the first embodiment and programs for running a predetermined OS are stored in advance, and a RAM, which is used to store operating parameters and data of each operation.
In the ultrasonic observation apparatus 1 having the abovementioned functional configuration, the constituent elements other than the ultrasonic probe 2 are implemented using a computer that includes a CPU for performing processing and control. The CPU of the ultrasonic observation apparatus 1 reads, from the memory unit 8, the information and the various programs including the operating programs of the ultrasonic observation apparatus 1, and performs the processing related to the operation method of the ultrasonic observation apparatus 1 according to the first embodiment.
The operating programs of the ultrasonic observation apparatus 1 can also be recorded in a computer readable recording medium such as a hard disk, a flash memory, a CD-ROM, a DVD-ROM, or a flexible disk for the purpose of distribution.
The flow of operations performed by the ultrasonic observation apparatus 1 is explained below.
Subsequently, the B-mode image data generating unit 51 generates B-mode image data using echo signals for B-mode images output by the transmitting-receiving unit 3 (Step S3).
Subsequently, the control unit 9 performs control so that the display unit 7 displays a B-mode image corresponding to the B-mode image data generated by the B-mode image data generating unit 51 (Step S4).
Then, the frequency analyzing unit 41 performs frequency analysis by means of FFT and calculates a frequency spectrum (Step S5). Herein, the operation performed by the frequency analyzing unit 41 at Step S5 is explained below in detail.
Then, the frequency analyzing unit 41 calculates the frequency spectrum of all data positions set on a single acoustic ray. Regarding that, firstly, the frequency analyzing unit 41 sets an initial value Z0 of a data position Z (equivalent to reception depth) that is representative of a sequence of data groups (FFT data groups) obtained for the purpose of FFT (Step S22).
Then, the frequency analyzing unit 41 obtains the FFT data group at the data position Z (Step S23) and implements the window function, which is stored in the window function storing unit 82, to the FFT data group that has been obtained (Step S24). By implementing the window function to the FFT data group, it becomes possible to avoid discontinuity at the boundary in the FFT data group. As a result, artifacts can be prevented from occurring.
Subsequently, the frequency analyzing unit 41 determines whether or not the FFT data group at the data position Z is a normal data group (Step S25). Herein, it is necessary that the number of sets of data in an FFT data group is a power of two. In the following explanation, it is assumed that the number of sets of data in the FFT data group is 2^n (where n is a positive integer). When an FFT data group is normal, it means that the data position Z is at the 2^(n−1)-th position from the front of the FFT data group. In other words, when an FFT data group is normal, it means that there are 2^(n−1)−1 (=N) sets of data prior to the data position Z, and there are 2^(n−1) (=M) sets of data subsequent to the data position Z.
If the determination result of Step S25 indicates that the FFT data group at the data position Z is normal (Yes at Step S25), then the system control proceeds to Step S27 (described later).
If the determination result of Step S25 indicates that the FFT data group at the data position Z is not normal (No at Step S25), then the frequency analyzing unit 41 inserts zero data equivalent to the deficit and generates a normal FFT data group (Step S26). To the FFT data group that is determined to be not normal at Step S25, the window function is implemented prior to the addition of zero data. Hence, even if zero data is inserted, discontinuity in data does not occur. Once the operation at Step S26 is completed, the system control proceeds to Step S27.
At Step S27, the frequency analyzing unit 41 performs FFT using the FFT data groups and obtains the frequency spectrum (Step S27).
Subsequently, the frequency analyzing unit 41 adds a predetermined data step width D to the data position Z, and calculates the data position Z of the FFT data group to be analyzed next (Step S28). Herein, it is desirable that the data step width D be matched with the data step width used at the time when the B-mode image data generating unit 51 generates B-mode image data. However, when the object is to reduce the amount of operations in the frequency analyzing unit 41, it is also possible to set the data step width D to a larger value than the data step width used by the B-mode image data generating unit 51.
Subsequently, the frequency analyzing unit 41 determines whether or not the data position Z is greater than a final data position Zmax (Step S29). Herein, the final data position Zmax can be set to the data length of the acoustic ray LD or to the data position corresponding to the lower edge of the area of concern. If the determination result indicates that the data position Z is greater than the final data position Zmax (Yes at Step S29), then the frequency analyzing unit 41 increments the acoustic ray number L by 1 (Step S30). On the other hand, if the determination result indicates that the data position Z is equal to or smaller than the final data position Zmax (No at Step S29), then the system control returns to Step S23. In this way, with respect to a single acoustic ray LD, the frequency analyzing unit 41 performs FFT for [{(Zmax−Z0)/D}+1] (=K) number of FFT data groups. Herein, [X] represents the largest integer not exceeding X.
If the acoustic ray number L that has been incremented at Step S30 is greater than a final acoustic ray number Lmax (Yes at Step S31), then the system control returns to the main routine.
In this way, the frequency analyzing unit 41 performs FFT for K number of times with respect to each of (Lmax−L0+1) number of acoustic rays. For example, the final acoustic ray number Lmax can be assigned to the final acoustic ray received by the transmitting-receiving unit 3 or to the acoustic ray corresponding to the boundary position on any one of the left and right sides of the area of concern. In the following explanation, the total number of times for which the frequency analyzing unit 41 performs FFT with respect to all acoustic rays is (Lmax−L0+1)×K and is referred to as “P”.
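The loop of Steps S22 through S29 can be summarized, for a single acoustic ray, by the following sketch (our reconstruction, not the original disclosure; the array layout, the Hanning window choice, and the dB conversion are assumptions, and the real apparatus repeats this over all acoustic rays from L0 to Lmax):

```python
import numpy as np

def analyze_acoustic_ray(ray, z0, z_max, step_d, n_fft=128):
    """Compute one frequency spectrum per data position Z on one acoustic
    ray: take the FFT data group around Z, apply a window function,
    zero-pad any deficit up to 2^n samples, then FFT (Steps S22-S29)."""
    half = n_fft // 2
    spectra = []
    z = z0
    while z <= z_max:
        start = max(0, z - (half - 1))          # 2^(n-1) - 1 samples before Z
        group = np.asarray(ray[start:z + half + 1], dtype=float)  # 2^(n-1) samples after Z
        group = group * np.hanning(len(group))  # window first, so zero-padding stays continuous
        if len(group) < n_fft:                  # not a normal data group: insert zero data
            group = np.concatenate([group, np.zeros(n_fft - len(group))])
        spectra.append(20.0 * np.log10(np.abs(np.fft.fft(group)) + 1e-12))
        z += step_d                             # advance by the data step width D
    return np.array(spectra)                    # K = [(Zmax - Z0)/D] + 1 spectra per ray
```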
Subsequent to the frequency analyzing operation performed at Step S5 as described above, the approximating unit 421 performs, as an approximation operation, regression analysis of the P number of frequency spectrums calculated by the frequency analyzing unit 41 and extracts the pre-correction feature data (Step S6). More particularly, the approximating unit 421 performs regression analysis to calculate the linear expression that approximates the frequency spectrum in the frequency band of fLOW<f<fHIGH, and then calculates the gradient a0, the intercept b0, and the intensity c0, which characterize the linear expression, as the pre-correction feature data. A straight line L1 represents the linear expression obtained by this regression analysis.
Then, the attenuation correcting unit 422 performs attenuation correction of the pre-correction feature data extracted by the approximating unit 421 (Step S7). For example, when the data sampling frequency is 50 MHz, the time interval for data sampling is 20 (nsec). If the velocity of sound is assumed to be 1530 (m/sec), then the data sampling spacing in terms of reception depth is 1530 (m/sec)×20 (nsec)/2=0.0153 (mm), where the division by 2 accounts for the round trip of the ultrasonic sound waves. If k is the number of data steps from the first set of data of the acoustic ray LD up to the data position of the FFT data group to be processed, then the data position Z thereof is equal to 0.0153k (mm). The attenuation correcting unit 422 substitutes the value of the data position Z, obtained in the manner described above, for the reception depth z in Equations (2) to (4) mentioned above, and calculates the gradient a, the intercept b, and the intensity c.
The straight line L1′ obtained by this attenuation correction is expressed as:
I=af+b=(a0+2αz)f+b0 (5)
As is clear from Equation (5), as compared to the straight line L1, the straight line L1′ has a greater gradient with the same intercept value.
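As a numeric sketch of the depth calculation above (the sampling frequency, sound velocity, and round-trip halving are taken from the text; the helper name is our own):

```python
def reception_depth_mm(k_steps, sound_speed_m_s=1530.0, sampling_hz=50e6):
    """Depth z for the k-th data step: one sample spans c / fs seconds of
    round-trip travel, i.e. c / fs / 2 of depth = 0.0153 mm at 50 MHz."""
    return k_steps * sound_speed_m_s / sampling_hz / 2.0 * 1e3
```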
Subsequently, the feature-data image data generating unit 52 generates feature-data image data (Step S8).
The details of the feature-data image data generating operation are explained below.
Then, the feature-data image data generating unit 52 obtains a feature point (first feature point) Si(j) that has been extracted by the feature data extracting unit 42 (Step S42), reads the residual image area flag of that feature point Si(j) from the feature-data-space information storing unit 84, and determines the value of the residual image area flag (Step S43). If the residual image area flag is “0” (0 at Step S43), then the feature-data image data generating unit 52 calculates the pixel value of the pixel j using the feature point Si(j) (Step S44).
On the other hand, if the residual image area flag of the feature point Si(j) is “1” (1 at Step S43), then the feature-data image data generating unit 52 determines whether or not a residual image condition is satisfied (Step S45).
Then, the feature-data image data generating unit 52 determines whether or not a vector μ0S′, which has the representative point μ0 as the start point and the projected point S′i(j) as the end point, has the same direction as a vector μ0T′, which has the representative point μ0 as the start point and the projected point T′i-1(j) as the end point (Step S62). If the determination result indicates that the two vectors μ0S′ and μ0T′ have the same direction (Yes at Step S62), then the feature-data image data generating unit 52 calculates a distance dS between the projected point S′i(j) and the representative point μ0 as well as calculates a distance dT between the projected point T′i-1(j) and the representative point μ0 (Step S63).
Subsequently, the feature-data image data generating unit 52 compares the distance dS with the distance dT (Step S64). If the comparison result indicates that the distance dS is equal to or greater than the distance dT (Yes at Step S64), then the feature-data image data generating unit 52 determines that the residual image condition is satisfied (Step S65), and the system control proceeds to Step S46.
Meanwhile, if the comparison result at Step S64 indicates that the distance dS is smaller than the distance dT (No at Step S64), then the feature-data image data generating unit 52 determines that the residual image condition is not satisfied (Step S66), and the system control proceeds to Step S46.
Meanwhile, if the determination result indicates that the two vectors μ0S′ and μ0T′ do not have the same direction (No at Step S62), then the system control proceeds to Step S66.
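A minimal sketch of this determination (Steps S62 to S66), assuming feature points are 2-D vectors in the feature data space and that mu0 and mu1 denote the representative points of the first type area and the second type area; the projection helper is our construction, not an API from the source:

```python
import numpy as np

def project_onto_reference_axis(p, mu0, mu1):
    """Project point p onto the reference axis passing through mu0 and mu1."""
    axis = (mu1 - mu0) / np.linalg.norm(mu1 - mu0)
    return mu0 + np.dot(p - mu0, axis) * axis

def residual_image_condition(s_i, t_prev, mu0, mu1):
    """Compare the projected points S' and T' of the latest feature point
    and the previous residual image feature point (Steps S62-S66)."""
    s_proj = project_onto_reference_axis(s_i, mu0, mu1)
    t_proj = project_onto_reference_axis(t_prev, mu0, mu1)
    if np.dot(s_proj - mu0, t_proj - mu0) <= 0:   # Step S62: directions differ -> not satisfied
        return False
    d_s = np.linalg.norm(s_proj - mu0)            # Step S63
    d_t = np.linalg.norm(t_proj - mu0)
    return d_s >= d_t                             # Steps S64-S66
```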
Returning to the explanation of the feature-data image data generating operation: if the residual image condition is satisfied (Yes at Step S46), then the feature-data image data generating unit 52 calculates the pixel value of the pixel j using the residual image feature point Ti-1(j) (Step S47); if the residual image condition is not satisfied (No at Step S46), then the system control proceeds to Step S44.
After the operation at Step S44 or Step S47 is completed, the feature-data image data generating unit 52 determines whether the feature point Si(j) or the residual image feature point Ti-1(j) used in calculating the pixel value is present in the residual image area (Step S48). When the feature point Si(j) or the residual image feature point Ti-1(j) is not present in the residual image area (No at Step S48), then the feature-data image data generating unit 52 sets the residual image area flag to “0” (Step S49).
Then, if the variable j for identifying pixel values has reached a maximum value jmax (Yes at Step S50), the feature-data image data generating unit 52 ends the feature-data image data generating operation. In contrast, if the variable j for identifying pixel values is smaller than the maximum value jmax (No at Step S50), the feature-data image data generating unit 52 increments the variable j by 1 (Step S51), and the system control returns to Step S42. Herein, the increment in the variable j means nothing more than a change in the pixel to be processed. Thus, there is no change in variables such as the feature point and the residual image feature point that are assigned to the pixels on an individual basis.
The following explanation is given for the case when, at Step S48, the feature point Si(j) or the residual image feature point Ti-1(j) is present in the residual image area (Yes at Step S48). In this case, the feature-data image data generating unit 52 sets the residual image area flag to “1” (Step S52), calculates a residual image feature point Ti(j) using the feature point Si(j), and stores the residual image feature point Ti(j) in the feature-data-space information storing unit 84 (Step S53). Then, the system control proceeds to Step S50.
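Putting Steps S42 through S53 together, the per-pixel flow can be sketched as follows. This is a simplification under assumed data structures (lists indexed by the pixel variable j), with the residual image condition, the residual-area test, the pixel-value mapping, and the residual-point construction passed in as functions rather than fixed to any particular implementation:

```python
def generate_feature_image_frame(feature_points, residual_points, flags,
                                 residual_condition, in_residual_area,
                                 pixel_value, make_residual_point):
    """One frame of the feature-data image data generating operation.
    feature_points[j]: latest feature point S_i(j); residual_points[j]:
    residual image feature point carried over from the previous frame
    (or None); flags[j]: residual image area flag."""
    values = []
    for j, s in enumerate(feature_points):                    # Steps S42-S43
        t_prev = residual_points[j]
        if flags[j] == 1 and t_prev is not None and residual_condition(s, t_prev):
            used = t_prev                                     # Steps S45-S47: keep the residual look
        else:
            used = s                                          # Step S44
        values.append(pixel_value(used))
        if in_residual_area(used):                            # Steps S48, S52-S53
            flags[j] = 1
            residual_points[j] = make_residual_point(s)
        else:                                                 # Step S49
            flags[j] = 0
            residual_points[j] = None
    return values
```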
Subsequently, the display unit 7 displays a feature data image generated by the feature-data image data generating unit 52 (Step S9).
Subsequently, when an instruction signal for instructing the end of operations is input via the input unit 6 (Yes at Step S10), then the ultrasonic observation apparatus 1 ends the sequence of operations. On the other hand, if no instruction signal for instructing the end of operations is input via the input unit 6 (No at Step S10), then the ultrasonic observation apparatus 1 increments the variable i for identifying the frames by 1 (Step S11), and the system control returns to Step S2.
Explained below is the effect of the first embodiment. Generally, as a target for observation, the ultrasonic observation apparatus 1 observes a slice plane having a uniform thickness inside the specimen.
In this way, in the first embodiment, while feature data images are displayed, a residual image is drawn, based on the tissue or on the residual image displayed immediately before, even in frames in which the tissue would otherwise not be displayed. Hence, even if a high-priority tissue is not observed in a particular frame, the image in the corresponding previous frame can be used to display a residual image, thereby making it possible to display the desired tissue over an extended period of time. As a result, the user can observe the desired images in a continuous manner.
According to the first embodiment of the present invention described above, in a feature data space, when a feature point of a frequency spectrum at a predetermined data position is present within a predetermined first type area in the i-th frame (where i is a positive integer) and moves closer to a second type area, which has a lower priority for image display than the first type area, in the subsequent (i+1)-th frame, feature-data image data that contains information related to feature data is generated by setting a virtual feature point at a position that is farther from the second type area than the position of the latest feature point and that is within or around the first type area, and by replacing visual information corresponding to the latest feature point at the predetermined data position in the (i+1)-th frame with visual information corresponding to the virtual feature point. Then, images corresponding to the feature-data image data that has been generated are displayed in a sequential manner. With that, it becomes possible to display, for as long a period of time as possible, images having pixel values corresponding to the feature points close to the area of high priority for image display. Therefore, even if the relative positional relationship with the target of observation changes with time, the desired tissues can be observed over an extended period of time.
Moreover, according to the first embodiment, the feature data is extracted by performing attenuation correction with respect to pre-correction feature data of a frequency spectrum that has been obtained by analyzing the frequencies of received ultrasonic sound waves. That extracted feature data is used along with feature data of frequency spectrums that is extracted on the basis of ultrasonic sound waves reflected from a plurality of known specimens. Hence, without having to make use of the strain amount or the degree of elasticity of the body tissues, it becomes possible to make clear distinction between different tissues. As a result, tissue characterizations can be distinguished with accuracy and the measurement result can be enhanced in terms of reliability.
Moreover, according to the first embodiment, even at the time of obtaining the feature data of known specimens, attenuation correction is performed with respect to pre-correction feature data of frequency spectrums obtained by means of frequency analysis, and the feature data obtained by performing such attenuation correction is used as the index to classify and determine tissue characterizations. Hence, it becomes possible to make distinction between mutually different tissue characterizations. Particularly, in the first embodiment, the feature data extracted by performing attenuation correction is used. Therefore, as compared to the case of using feature data that is extracted without performing attenuation correction, the area of each group in the feature data space can be obtained in a more distinctly separated state.
During ultrasonic elastography, the pressure applied by pressing does not easily reach the inferior regions of vascular channels such as blood vessels or lymph vessels. For that reason, if a tumor is formed in the vicinity of a vascular channel, the boundary of the tumor remains ambiguous and it is difficult to identify the invasion of the tumor into the vascular channel. Hence, there are times when the specimen cannot be observed with accuracy. Moreover, during ultrasonic elastography, the amount of pressure or the pressing speed can easily differ from one examiner to another, which leads to a low degree of reliability in the observation result. In that regard, according to the first embodiment, as described above, a specimen is observed with a high degree of accuracy, so it is possible to achieve enhancement in terms of reliability. As a result, it becomes possible to provide a technology that is suited to resolving the issues specific to ultrasonic elastography.
In a second embodiment of the present invention, the feature-data image data generating operation performed by the feature-data image data generating unit differs from that in the first embodiment. The configuration of an ultrasonic observation apparatus according to the second embodiment is the same as the configuration of the ultrasonic observation apparatus 1 according to the first embodiment. Thus, in the following explanation, the constituent elements identical to those in the ultrasonic observation apparatus 1 are referred to by the same reference numerals.
In the second embodiment, the feature data information storing unit 85 also stores therein a boundary feature point Ui(j), which is either the feature point Si(j) or the residual image feature point Ti(j) used in calculating the pixel value and which is used in the subsequent frame for the purpose of setting a boundary that is required at the time of determining the residual image condition in the feature data space.
Then, the feature-data image data generating unit 52 obtains the feature point (third feature point) Si(j) that has been extracted by the feature data extracting unit 42 (Step S72), reads the residual image area flag of that feature point Si(j) from the feature-data-space information storing unit 84, and determines the value of the residual image area flag (Step S73). If the residual image area flag is “0” (0 at Step S73), then the feature-data image data generating unit 52 calculates the pixel value of the pixel j using the feature point Si(j) (Step S74).
On the other hand, if the residual image area flag of the feature point Si(j) is “1” (1 at Step S73), then the feature-data image data generating unit 52 determines whether or not a residual image condition is satisfied (Step S75).
Then, the feature-data image data generating unit 52 determines whether or not the feature point Si(j) is positioned on the same side as the second type area with respect to the reference boundary (Step S92). If the feature point Si(j) is positioned on the same side as the second type area with respect to the reference boundary (Yes at Step S92), then the feature-data image data generating unit 52 determines that the residual image condition is satisfied (Step S93), and the system control proceeds to Step S76.
On the other hand, if the feature point Si(j) is not positioned on the same side as the second type area with respect to the reference boundary (No at Step S92), then the feature-data image data generating unit 52 determines that the residual image condition is not satisfied (Step S94), and the system control proceeds to Step S76.
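A sketch of this boundary test (Steps S92 to S94), assuming the reference boundary is the line through the boundary feature point that is orthogonal to the axis joining the representative points mu0 (first type area) and mu1 (second type area); the function and parameter names are illustrative:

```python
import numpy as np

def on_second_area_side(s_i, u_boundary, mu0, mu1):
    """True when the latest feature point S_i(j) lies on the second-type-area
    side of the reference boundary through the boundary feature point U."""
    axis = mu1 - mu0                          # direction of the reference axis
    return np.dot(s_i - u_boundary, axis) > 0
```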
Returning to the explanation of the feature-data image data generating operation: if the residual image condition is satisfied (Yes at Step S76), then the feature-data image data generating unit 52 calculates the residual image feature point Ti(j) from the feature point Si(j) (Step S77).
Subsequently, the feature-data image data generating unit 52 calculates the pixel value of the pixel j using the residual image feature point Ti(j) (Step S78).
Meanwhile, if the residual image condition is not satisfied at Step S76 (No at Step S76), then the system control proceeds to Step S74.
After the operation at Step S74 or Step S78 is completed, the feature-data image data generating unit 52 determines whether the feature point Si(j) or the residual image feature point Ti(j) used in calculating the pixel value is present in the residual image area (Step S79). When the feature point Si(j) or the residual image feature point Ti(j) is not present in the residual image area (No at Step S79), then the feature-data image data generating unit 52 sets the residual image area flag to “0” (Step S80).
Then, if the variable j for identifying pixel values has reached the maximum value jmax (Yes at Step S81), the feature-data image data generating unit 52 ends the feature-data image data generating operation. In contrast, if the variable j for identifying pixel values is smaller than the maximum value jmax (No at Step S81), the feature-data image data generating unit 52 increments the variable j by 1 (Step S82), and the system control returns to Step S72. Herein too, the increment in the variable j means nothing more than a change in the pixel to be processed. Thus, there is no change in variables such as the feature point and the residual image feature point that are assigned to the pixels on an individual basis.
The following explanation is given for the case when, at Step S79, the feature point Si(j) or the residual image feature point Ti(j) is present in the residual image area (Yes at Step S79). In this case, the feature-data image data generating unit 52 sets the residual image area flag to “1” (Step S83); and stores the feature point Si(j) or the residual image feature point Ti(j), which is used in calculating the pixel value, as the boundary feature point Ui(j) in the feature data information storing unit 85 (Step S84). Then, the system control proceeds to Step S81.
According to the second embodiment of the present invention described above, in a feature data space, when a feature point of a frequency spectrum at a predetermined data position is present within a predetermined first type area in the i-th frame (where i is a positive integer) and moves closer to a second type area, which has a lower priority for image display than the first type area, in the subsequent (i+1)-th frame, feature-data image data that contains information related to the feature data is generated by setting a virtual feature point at a position that is farther from the second type area than the position of the latest feature point and that is within or around the first type area, and by replacing the visual information corresponding to the latest feature point in the (i+1)-th frame of the predetermined data position with visual information corresponding to the virtual feature point. Then, images corresponding to the feature-data image data that has been generated are displayed in a sequential manner. As a result, images having pixel values corresponding to feature points close to the area of high priority for image display can be kept on display for as long as possible. Therefore, even if the relative positional relationship with the target of observation changes with time, the desired tissues can be observed over an extended period of time.
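A minimal sketch of this frame-to-frame replacement, assuming a simple distance-based pull-back rule, might read as follows (Python; in the embodiment itself, the replacement is realized through the residual image feature point machinery described above):

```python
def choose_display_point(prev_point, new_point, dist_to_second_area):
    """Keep displaying a point that stays far from the low-priority area.

    dist_to_second_area is a hypothetical helper returning the distance of
    a feature point from the second type area in the feature data space.
    """
    if dist_to_second_area(new_point) < dist_to_second_area(prev_point):
        # The new point drifted toward the second type area: substitute a
        # virtual feature point farther from it (here, simply the previous
        # frame's point, which lies within or around the first type area).
        return prev_point
    return new_point
```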
Moreover, according to the second embodiment, the feature data is extracted by performing attenuation correction with respect to pre-correction feature data of a frequency spectrum that has been obtained by analyzing the frequencies of the received ultrasonic sound waves. The extracted feature data is then used along with feature data of frequency spectrums extracted on the basis of ultrasonic sound waves reflected from a plurality of known specimens. Hence, without having to make use of the strain amount or the degree of elasticity of the body tissues, it becomes possible to make a clear distinction between different tissues. As a result, tissue characterizations can be distinguished accurately, and the reliability of the measurement result is enhanced. It thus becomes possible to provide a technology that is well suited to ultrasonic elastography.
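Where the use of known-specimen feature data is concerned, one illustrative (and purely assumed) way to distinguish tissues in the feature data space is a nearest-centroid rule; this excerpt does not specify the actual decision rule:

```python
import numpy as np

# Hedged sketch: classify a feature point by the nearest group of known
# specimens in the feature data space. The classifier is an illustrative
# assumption, not the method stated in this document.

def classify_tissue(feature_point, known_centroids):
    """Return the label of the known specimen group nearest in feature space.

    known_centroids: dict mapping a tissue label to its centroid coordinates.
    """
    labels = list(known_centroids)
    dists = [np.linalg.norm(np.asarray(feature_point) -
                            np.asarray(known_centroids[k])) for k in labels]
    return labels[int(np.argmin(dists))]
```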
In a third embodiment of the present invention, the feature data extracting operation performed by the feature data extracting unit differs from that in the first embodiment. The configuration of an ultrasonic observation apparatus according to the third embodiment is the same as the configuration of the ultrasonic observation apparatus 1 according to the first embodiment. Thus, in the following explanation, the constituent elements identical to those in the ultrasonic observation apparatus 1 are referred to by the same reference numerals.
During the feature data extracting operation according to the third embodiment, firstly, the attenuation correcting unit 422 performs attenuation correction with respect to the frequency spectrum calculated by the frequency analyzing unit 41. Then, the approximating unit 421 performs approximation with respect to the frequency spectrum that has been subjected to attenuation correction by the attenuation correcting unit 422, and extracts the feature data of the frequency spectrum.
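A minimal sketch of this ordering (correct first, then approximate) is given below; the round-trip correction term 2·α·z·f is a commonly used model assumed here for illustration, since this excerpt does not state the formula employed by the attenuation correcting unit 422:

```python
import numpy as np

# Hedged sketch of the third embodiment's ordering: correct the spectrum
# for attenuation first, then approximate it. The correction grows linearly
# with frequency f and depth z (an assumed, common model).

def attenuation_correct(freqs_mhz, spectrum_db, depth_cm,
                        alpha_db_per_cm_mhz=0.5):
    """Add back the attenuation estimated for a reflector at depth_cm."""
    freqs_mhz = np.asarray(freqs_mhz, dtype=float)
    spectrum_db = np.asarray(spectrum_db, dtype=float)
    return spectrum_db + 2.0 * alpha_db_per_cm_mhz * depth_cm * freqs_mhz
```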
Then, the attenuation correcting unit 422 performs attenuation correction with respect to all of the frequency spectrums calculated by the frequency analyzing unit 41 by means of FFT (Step S106).
Subsequently, the approximating unit 421 performs regression analysis of all of the frequency spectrums that have been subjected to attenuation correction by the attenuation correcting unit 422, and extracts the feature data of the frequency spectrums (Step S107). More particularly, the approximating unit 421 performs regression analysis and calculates the gradient a, the intercept b, and the intensity c at the central frequency fMID, which characterize the linear expression. The straight line L3 obtained by this regression analysis approximates the frequency spectrum that has been subjected to attenuation correction.
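For illustration, the regression step might be sketched as follows; taking fMID as the midpoint of the analysis band is an assumption of this sketch:

```python
import numpy as np

# Sketch of Step S107: fit a straight line to a corrected spectrum and read
# off the gradient a, the intercept b, and the intensity c at the central
# frequency f_MID on the fitted line.

def extract_features(freqs_mhz, corrected_db):
    a, b = np.polyfit(freqs_mhz, corrected_db, 1)   # linear regression
    f_mid = 0.5 * (freqs_mhz[0] + freqs_mhz[-1])    # central frequency f_MID
    c = a * f_mid + b                               # intensity on the line
    return a, b, c
```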
The operations performed at Step S108 to Step S111 are respectively identical to the operations performed at Step S8 to Step S11 in the first embodiment.
According to the third embodiment of the present invention described above, in a feature data space, when a feature point of a frequency spectrum at a predetermined data position is present within a predetermined first type area in the i-th frame (where i is a positive integer) and moves closer to a second type area, which has a lower priority for image display than the first type area, in the subsequent (i+1)-th frame, feature-data image data that contains information related to the feature data is generated by setting a virtual feature point at a position that is farther from the second type area than the position of the latest feature point and that is within or around the first type area, and by replacing the visual information corresponding to the latest feature point in the (i+1)-th frame of the predetermined data position with visual information corresponding to the virtual feature point. Then, images corresponding to the feature-data image data that has been generated are displayed in a sequential manner. As a result, images having pixel values corresponding to feature points close to the area of high priority for image display can be kept on display for as long as possible. Therefore, even if the relative positional relationship with the target of observation changes with time, the desired tissues can be observed over an extended period of time.
Moreover, according to the third embodiment, attenuation correction is performed with respect to a frequency spectrum that has been obtained by analyzing the frequencies of the received ultrasonic sound waves, and the feature data is extracted from the frequency spectrum that has been subjected to attenuation correction. The extracted feature data is then used along with feature data of frequency spectrums extracted on the basis of ultrasonic sound waves reflected from a plurality of known specimens. Hence, without having to make use of the strain amount or the degree of elasticity of the body tissues, it becomes possible to make a clear distinction between different tissues. As a result, tissue characterizations can be distinguished accurately, and the reliability of the measurement result is enhanced. It thus becomes possible to provide a technology that is well suited to ultrasonic elastography.
Additional advantages and modifications will readily occur to those skilled in the art. Therefore, the invention in its broader aspects is not limited to the specific details and representative embodiments shown and described herein. Accordingly, various modifications may be made without departing from the spirit or scope of the general inventive concept as defined by the appended claims and their equivalents.