Provided is an ultrasound image display method. The ultrasound image display method includes displaying an ultrasound image of an object, selecting at least one region of interest (ROI) in the ultrasound image based on a user input, converting image pixel information corresponding to the at least one ROI into height values, and three-dimensionally displaying a partial ultrasound image corresponding to the at least one ROI by using the height values.
19. An ultrasound apparatus comprising:
a display configured to display an ultrasound image of an object, wherein the display displays a three-dimensional (3D) partial ultrasound image overlappingly on the ultrasound image which is a two-dimensional (2D) ultrasound image;
a user input unit configured to receive a selection of at least one region of interest (ROI) in the ultrasound image; and
a controller configured to convert image pixel information corresponding to the at least one ROI into height values having a sign, determined by the controller, based on movement direction information of a tissue or a bloodstream when the ultrasound image is a color Doppler image or a spectral Doppler image, and control the display to three-dimensionally display a partial ultrasound image corresponding to the at least one ROI by using the height values,
wherein the controller applies a semitransparent effect or a transparent effect to other images in the ultrasound image which is the 2D ultrasound image except a region in which the 3D partial ultrasound image is overlappingly displayed.
1. An ultrasound image display method comprising:
displaying an ultrasound image of an object;
selecting at least one region of interest (ROI) in the ultrasound image based on a user input;
converting image pixel information corresponding to the at least one ROI into height values, wherein the converting of the image pixel information into the height values comprises determining a sign of the height values based on movement direction information of a tissue or a bloodstream when the ultrasound image is a color Doppler image or a spectral Doppler image; and
three-dimensionally displaying a partial ultrasound image corresponding to the at least one ROI by using the height values,
wherein the three-dimensional displaying of the partial ultrasound image corresponding to the at least one ROI comprises displaying the three-dimensional (3D) partial ultrasound image overlappingly on the ultrasound image which is a two-dimensional (2D) form, and
wherein the displaying of the 3D partial ultrasound image overlappingly on the ultrasound image which is the 2D form comprises applying a semitransparent effect or a transparent effect to other images in the ultrasound image which is the 2D form except a region in which the 3D partial ultrasound image is overlappingly displayed.
2. The ultrasound image display method of
3. The ultrasound image display method of
4. The ultrasound image display method of
5. The ultrasound image display method of
6. The ultrasound image display method of
7. The ultrasound image display method of
8. The ultrasound image display method of
9. The ultrasound image display method of
10. The ultrasound image display method of
11. The ultrasound image display method of
receiving a selection of at least one rendering method among a plurality of rendering methods; and
three-dimensionally displaying the partial ultrasound image by the selected rendering method.
12. The ultrasound image display method of
the at least one ROI comprises a plurality of ROIs, and
the three-dimensional displaying of the partial ultrasound image corresponding to the at least one ROI comprises three-dimensionally displaying a plurality of partial ultrasound images corresponding respectively to the plurality of ROIs.
13. The ultrasound image display method of
14. The ultrasound image display method of
adjusting sizes of the plurality of partial ultrasound images to a predetermined size; and
displaying the plurality of size-adjusted partial ultrasound images overlappingly on the ultrasound image.
15. The ultrasound image display method of
16. The ultrasound image display method of
17. The ultrasound image display method of
18. A non-transitory computer-readable recording medium that stores a program that, when executed by a computer, performs the ultrasound image display method of
20. The ultrasound apparatus of
21. The ultrasound apparatus of
22. The ultrasound apparatus of
23. The ultrasound apparatus of
24. The ultrasound apparatus of
25. The ultrasound apparatus of
26. The ultrasound apparatus of
the user input unit receives a selection of at least one rendering method among a plurality of rendering methods, and
the display three-dimensionally displays the partial ultrasound image by the selected rendering method.
27. The ultrasound apparatus of
the at least one ROI comprises a plurality of ROIs, and
the display three-dimensionally displays a plurality of partial ultrasound images corresponding respectively to the plurality of ROIs.
28. The ultrasound apparatus of
29. The ultrasound apparatus of
30. The ultrasound apparatus of
This application claims the benefit of Korean Patent Application No. 10-2014-0010884, filed on Jan. 28, 2014, in the Korean Intellectual Property Office, the disclosure of which is incorporated herein in its entirety by reference.
1. Field
One or more exemplary embodiments relate to ultrasound image display methods and ultrasound apparatuses for displaying an ultrasound image corresponding to a region of interest (ROI) as a three-dimensional (3D) image having height values.
2. Description of the Related Art
An ultrasound diagnosis apparatus transfers an ultrasound signal from a body surface of an object to a predetermined region in a body of the object, and obtains a tomogram of a soft tissue or an image of a bloodstream by using information of an ultrasound signal reflected from a tissue in the body.
The ultrasound diagnosis apparatus is small and inexpensive and may display images in real time. Also, since the ultrasound diagnosis apparatus involves no exposure to X-rays or the like and is thus highly safe, it is widely used along with other image diagnosis apparatuses such as an X-ray diagnosis apparatus, a computerized tomography (CT) scanner, a magnetic resonance imaging (MRI) apparatus, and a nuclear medicine diagnosis apparatus.
Since values measured by the ultrasound diagnosis apparatus are closely related to a lesion diagnosis or the like, the accuracy of the values is required. Therefore, there is a need for a system that enables a user to accurately understand an ultrasound image.
One or more exemplary embodiments include ultrasound image display methods and ultrasound apparatuses for displaying an ultrasound image corresponding to a region of interest (ROI) as a three-dimensional (3D) image having a height value.
Additional aspects will be set forth in part in the description which follows and, in part, will be apparent from the description, or may be learned by practice of the presented exemplary embodiments.
According to one or more exemplary embodiments, an ultrasound image display method includes: displaying an ultrasound image of an object; selecting at least one ROI in the ultrasound image based on a user input; converting image pixel information corresponding to the at least one ROI into height values; and three-dimensionally displaying a partial ultrasound image corresponding to the at least one ROI by using the height values.
The ultrasound image may include at least one of a brightness (B) mode image, a color Doppler image, a spectral Doppler image, a tissue Doppler image, an elasticity image, and a motion (M) mode image.
The image pixel information may include at least one of a brightness value, a speed value, a color value, an elasticity value, an amplitude value of a sound reflection signal, and a sound impedance value.
The selecting of the at least one ROI may include selecting an ROI of a predetermined size based on a point selected by a user.
The selecting of the at least one ROI may include selecting a region, which has a predetermined similarity value or more with respect to pattern information of a point selected by a user, in the ultrasound image as the ROI.
The converting of the image pixel information into the height values may include determining a sign of the height values based on movement direction information of a tissue or a bloodstream when the ultrasound image is a color Doppler image or a spectral Doppler image.
The three-dimensional displaying of the partial ultrasound image corresponding to the at least one ROI may include two-dimensionally displaying other images in the ultrasound image except the partial ultrasound image corresponding to the at least one ROI.
The three-dimensional displaying of the partial ultrasound image corresponding to the at least one ROI may include displaying the three-dimensional (3D) partial ultrasound image overlappingly on the ultrasound image which is a two-dimensional (2D) form.
The displaying of the 3D partial ultrasound image overlappingly on the ultrasound image which is the 2D form may include applying a semitransparent effect or a transparent effect to other images in the ultrasound image which is the 2D form except a region in which the 3D partial ultrasound image is overlappingly displayed.
The three-dimensional displaying of the partial ultrasound image corresponding to the at least one ROI may include three-dimensionally displaying the partial ultrasound image in a region different from a region in which the ultrasound image is displayed.
The ultrasound image display method may further include adjusting at least one of a position, a rotation angle, and a size of the three-dimensionally displayed partial ultrasound image.
The ultrasound image display method may further include displaying at least one of an average height value, a maximum height value, a minimum height value, and a variance value of the partial ultrasound image.
The three-dimensional displaying of the partial ultrasound image corresponding to the at least one ROI may include three-dimensionally displaying the partial ultrasound image by using a light source-based rendering method.
The three-dimensional displaying of the partial ultrasound image corresponding to the at least one ROI may include: receiving a selection of at least one rendering method among a plurality of rendering methods; and three-dimensionally displaying the partial ultrasound image by the selected rendering method.
The at least one ROI may include a plurality of ROIs, and the three-dimensional displaying of the partial ultrasound image corresponding to the at least one ROI may include three-dimensionally displaying a plurality of partial ultrasound images corresponding respectively to the plurality of ROIs.
The three-dimensional displaying of the plurality of partial ultrasound images corresponding respectively to the plurality of ROIs may include displaying the plurality of partial ultrasound images corresponding respectively to the plurality of ROIs in a side view.
The three-dimensional displaying of the plurality of partial ultrasound images corresponding respectively to the plurality of ROIs may include: adjusting sizes of the plurality of partial ultrasound images to a predetermined size; and displaying the plurality of size-adjusted partial ultrasound images overlappingly on the ultrasound image.
The three-dimensional displaying of the plurality of partial ultrasound images corresponding respectively to the plurality of ROIs may include three-dimensionally displaying a difference image between the plurality of partial ultrasound images corresponding respectively to the plurality of ROIs.
The selecting of the at least one ROI may include selecting an ROI having at least one of an elliptical shape, a tetragonal shape, and a free curve shape.
The three-dimensional displaying of the partial ultrasound image corresponding to the at least one ROI may include displaying the partial ultrasound image in at least one form of a gray scale map, a color scale map, a contour line map, a contour surface map, and a numerical value map.
According to one or more exemplary embodiments, an ultrasound apparatus includes: a display configured to display an ultrasound image of an object; a user input unit configured to receive a selection of at least one ROI in the ultrasound image; and a controller configured to convert image pixel information corresponding to the at least one ROI into height values and control the display to three-dimensionally display a partial ultrasound image corresponding to the at least one ROI by using the height values.
The controller may select an ROI of a predetermined size based on a point selected by a user.
The controller may select a region, which has a predetermined similarity value or more with respect to pattern information of a point selected by a user, in the ultrasound image as the ROI.
The controller may determine a sign of the height values based on movement direction information of a tissue or a bloodstream when the ultrasound image is a color Doppler image or a spectral Doppler image.
The display may display the three-dimensional (3D) partial ultrasound image overlappingly on the two-dimensional (2D) ultrasound image.
The controller may apply a semitransparent effect or a transparent effect to other images in the ultrasound image except a region in which the 3D partial ultrasound image is overlappingly displayed.
The display may three-dimensionally display the partial ultrasound image in a region different from a region in which the ultrasound image is displayed.
The controller may adjust at least one of a position, a rotation angle, and a size of the three-dimensionally displayed partial ultrasound image.
The display may further display at least one of an average height value, a maximum height value, a minimum height value, and a variance value of the partial ultrasound image.
The display may three-dimensionally display the partial ultrasound image by using a light source-based rendering method.
The user input unit may receive a selection of at least one rendering method among a plurality of rendering methods, and the display may three-dimensionally display the partial ultrasound image by the selected rendering method.
The at least one ROI may include a plurality of ROIs, and the display may three-dimensionally display a plurality of partial ultrasound images corresponding respectively to the plurality of ROIs.
The display may display the plurality of partial ultrasound images in a side view.
The controller may adjust sizes of the plurality of partial ultrasound images to a predetermined size and control the display to display the plurality of size-adjusted partial ultrasound images overlappingly on the ultrasound image.
The display may three-dimensionally display a difference image between the plurality of partial ultrasound images corresponding respectively to the plurality of ROIs.
These and/or other aspects will become apparent and more readily appreciated from the following description of the exemplary embodiments, taken in conjunction with the accompanying drawings in which:
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to like elements throughout. In this regard, the present exemplary embodiments may have different forms and should not be construed as being limited to the descriptions set forth herein. Accordingly, the exemplary embodiments are merely described below, by referring to the figures, to explain aspects of the present description. As used herein, expressions such as “at least one of,” when preceding a list of elements, modify the entire list of elements and do not modify the individual elements of the list.
The terms used in this specification are those general terms currently widely used in the art in consideration of functions in regard to the present invention, but the terms may vary according to the intention of those of ordinary skill in the art, precedents, or new technology in the art. Also, specified terms may be selected by the applicant, and in this case, the detailed meaning thereof will be described in the detailed description of the inventive concept. Thus, the terms used in the specification should be understood not as simple names but based on the meaning of the terms and the overall description of the inventive concept.
When something “comprises” or “includes” a component, another component may be further included unless specified otherwise. Also, terms such as “…unit”, “…module”, or the like refer to units that perform at least one function or operation, and the units may be implemented as hardware or software or as a combination of hardware and software.
Throughout the specification, an “ultrasound image” refers to an image of an object obtained by using an ultrasound signal. The object may refer to a part of a body. For example, the object may include organs such as liver, heart, nuchal translucency (NT), brain, breast, and abdomen, or embryo.
Throughout the specification, a “user” may be, but is not limited to, a medical expert including a doctor, a nurse, a medical laboratory technologist, or a sonographer.
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. The inventive concept may, however, be embodied in many different forms and should not be construed as being limited to the exemplary embodiments set forth herein. In addition, portions irrelevant to the description of the exemplary embodiments will be omitted in the drawings for a clear description of the exemplary embodiments, and like reference numerals will denote like elements throughout the specification.
As illustrated in
In this specification, the ultrasound image may be implemented variously. For example, the ultrasound image may include, but is not limited to, at least one of a brightness (B) mode image representing a magnitude of an ultrasound echo signal reflected from the object 10 by brightness, a color Doppler image representing a speed of a moving object by color by using the Doppler effect, a spectral Doppler image representing an image of a moving object in the form of a spectrum by using the Doppler effect, a motion (M) mode image representing a time-dependent motion of an object at a predetermined position, and an elasticity mode image representing, as an image, a difference between a reaction of an object when compression is applied to the object and a reaction of the object when compression is not applied to the object.
According to an exemplary embodiment, the ultrasound image may be a two-dimensional (2D) image, a three-dimensional (3D) image, or a four-dimensional (4D) image. The ultrasound image will be described below in detail with reference to
As illustrated in
Thus, as illustrated in
According to an exemplary embodiment, the ultrasound apparatus 1000 may provide an ultrasound image as the height map 220, so that the user may intuitively detect a brightness difference in the ultrasound image by using a height difference.
As illustrated in
However, as illustrated in
Hereinafter, a method of displaying a portion of an ultrasound image as a 3D image having height values to enable the user to reduce an analysis error in the ultrasound image will be described in detail with reference to
Referring to
According to an exemplary embodiment, the ultrasound apparatus 1000 may directly generate an ultrasound image of the object or may receive an ultrasound image of the object from an outside thereof. For example, the ultrasound apparatus 1000 may transmit an ultrasound signal to the object and generate an ultrasound image by using an ultrasound response signal reflected from the object. Also, the ultrasound apparatus 1000 may receive an ultrasound image from an external server or an external device.
The ultrasound image may include, but is not limited to, at least one of a brightness (B) mode image, a color Doppler image, a spectral Doppler image, a tissue Doppler image, an elasticity image, and a motion (M) mode image.
In operation S320, the ultrasound apparatus 1000 may select at least one region of interest (ROI) in the ultrasound image based on a user input. For example, the ultrasound apparatus 1000 may receive a user input for selecting an ROI in the ultrasound image.
According to an exemplary embodiment, various user inputs may be used to select the ROI. For example, the user input may include, but is not limited to, at least one of a key input, a touch input (e.g., a tap, a double tap, a touch & drag, a flick, or a swipe), a voice input, a motion input, and a multiple input.
According to an exemplary embodiment, the ROI may have various shapes. For example, the ROI may have, but is not limited to, a circular shape, an elliptical shape, a tetragonal shape, or a free curve shape. Also, the ROI may have various colors and various patterns.
According to an exemplary embodiment, the ultrasound apparatus 1000 may semi-automatically select the ROI. For example, the ultrasound apparatus 1000 may receive a selection of a specific point from the user. The ultrasound apparatus 1000 may select an ROI of a predetermined size (e.g., 10 pixels or 5 cm2) based on a point selected by the user. The predetermined size may be preset by the user or by the ultrasound apparatus 1000.
Also, the ultrasound apparatus 1000 may select a region, which has a predetermined similarity value or more with respect to pattern information of a point selected by the user, in the ultrasound image as the ROI. For example, the ultrasound apparatus 1000 may select a region, which has a similar pattern value to a point selected by the user, as the ROI by using the following equations (Equations 1 to 3) that are used to analyze texture characteristics such as a gray level co-occurrence matrix (GLCM), entropy, and mutual information.
A GLCM method calculates the relationship between a brightness value of a current pixel and a brightness value of an adjacent pixel as a basic statistic such as an average, a contrast, or a correlation, allocates the calculation value as a new brightness value to a center pixel in a kernel, and represents a partial texture characteristic of an input image. Those of ordinary skill in the art will readily understand the following equations, and thus detailed descriptions thereof will be omitted.
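The GLCM-based similarity analysis above can be sketched in code. The patent references Equations 1 to 3 without reproducing them here, so the exact statistics it uses are unknown; this minimal sketch builds a normalized co-occurrence matrix for one pixel offset and computes two common GLCM statistics (contrast and entropy). All function names and the choice of statistics are illustrative assumptions.

```python
import numpy as np

def glcm(image, levels=4, dx=1, dy=0):
    """Build a normalized gray-level co-occurrence matrix for one offset."""
    m = np.zeros((levels, levels), dtype=float)
    h, w = image.shape
    for y in range(h - dy):
        for x in range(w - dx):
            # Count how often gray level image[y, x] is followed by
            # gray level image[y + dy, x + dx] at the given offset.
            m[image[y, x], image[y + dy, x + dx]] += 1
    return m / m.sum()

def glcm_stats(m):
    """Two basic GLCM texture statistics: contrast and entropy."""
    i, j = np.indices(m.shape)
    contrast = float(((i - j) ** 2 * m).sum())
    nz = m[m > 0]
    entropy = float(-(nz * np.log2(nz)).sum())
    return contrast, entropy

# Toy 4-level image with four uniform quadrants.
img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [2, 2, 3, 3],
                [2, 2, 3, 3]])
m = glcm(img, levels=4)
contrast, entropy = glcm_stats(m)
```

Comparing such statistics between the window around a user-selected point and neighboring windows is one way to find regions with "a predetermined similarity value or more," as described above.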
According to an exemplary embodiment, the ultrasound apparatus 1000 may receive a selection of at least one ROI through a graphical user interface (GUI). This will be described later in detail with reference to
In operation S330, the ultrasound apparatus 1000 may convert image pixel information corresponding to at least one ROI into height values. According to an exemplary embodiment, the image pixel information, which represents values of pixels displayed on a screen, may include, but is not limited to, at least one of a brightness value, a speed value, a color value, an elasticity value, an amplitude value of a sound reflection signal, and a sound impedance value.
According to an exemplary embodiment, the ultrasound apparatus 1000 may convert brightness values included in the ROI into height values by using a first mapping table representing a correlation between brightness values and height values. In this case, the height values may increase as the brightness values increase; however, embodiments are not limited thereto.
Also, the ultrasound apparatus 1000 may convert color values included in the ROI into height values by using a second mapping table representing a correlation between a color value and a height value. Also, the ultrasound apparatus 1000 may convert speed values or elasticity values included in the ROI into height values.
According to an exemplary embodiment, the ultrasound apparatus 1000 may determine a sign of the height values based on movement direction information of a tissue or a bloodstream when the ultrasound image is a color Doppler image or a spectral Doppler image. For example, when the movement direction of a tissue or a bloodstream is a first direction, the ultrasound apparatus 1000 may determine the height values as positive values; and when the movement direction of a tissue or a bloodstream is a second direction, the ultrasound apparatus 1000 may determine the height values as negative values.
The ultrasound apparatus 1000 may also determine a sign (e.g., a plus sign or a minus sign) of the height values with respect to other directional image data in addition to Doppler image data (e.g., color Doppler image data or spectral Doppler image data).
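The conversion in operation S330 can be sketched as follows. The patent only requires some mapping table correlating pixel values with heights; the linear map below, the 0-255/0-100 ranges, and the convention of positive heights for the first flow direction are all assumptions for illustration.

```python
def brightness_to_height(brightness, max_brightness=255.0, max_height=100.0):
    """Map a pixel brightness to a height via a simple linear 'mapping table'.
    The patent requires only a monotonic correlation; linear is an assumption."""
    return brightness / max_brightness * max_height

def signed_height(height, direction):
    """Give the height a sign from Doppler movement direction information:
    +1 for the first direction, -1 for the second (illustrative convention)."""
    return height if direction >= 0 else -height

h = brightness_to_height(128)     # mid-gray pixel
toward = signed_height(h, +1)     # e.g., flow in the first direction
away = signed_height(h, -1)       # e.g., flow in the second direction
```

Applying this per pixel over the ROI yields the height map used for the 3D partial ultrasound image.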
In operation S340, the ultrasound apparatus 1000 may three-dimensionally display a partial ultrasound image corresponding to at least one ROI by using the height values.
According to an exemplary embodiment, the ultrasound apparatus 1000 may three-dimensionally display a partial ultrasound image in an ROI of the ultrasound image. For example, the ultrasound apparatus 1000 may display a 3D partial ultrasound image overlappingly on the ultrasound image. In this case, the ultrasound apparatus 1000 may three-dimensionally display a partial ultrasound image corresponding to at least one ROI in the ultrasound image and may two-dimensionally display the other images.
According to another exemplary embodiment, the ultrasound apparatus 1000 may three-dimensionally display a partial ultrasound image in a second region different from a first region in which the ultrasound image is displayed.
According to an exemplary embodiment, the ultrasound apparatus 1000 may provide additional information related to the partial ultrasound image. For example, the ultrasound apparatus 1000 may display at least one of an average height value, a maximum height value, a minimum height value, and a variance value of the partial ultrasound image.
According to an exemplary embodiment, the ultrasound apparatus 1000 may display the 3D partial ultrasound image as a gray scale map or a color scale map. For example, the ultrasound apparatus 1000 may display the 3D partial ultrasound image such that a higher portion is brighter and a lower portion is darker. Also, the ultrasound apparatus 1000 may display the 3D partial ultrasound image such that a higher portion is redder and a lower portion is bluer.
According to an exemplary embodiment, the ultrasound apparatus 1000 may display the 3D partial ultrasound image in at least one form of a contour line map, a contour surface map, and a numerical value map. For example, the ultrasound apparatus 1000 may display a contour line, a contour surface, or a numerical value in the 3D partial ultrasound image.
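The color-scale mapping described above (higher portions redder, lower portions bluer) can be sketched as a linear interpolation between blue and red; the linear ramp and the sample height range are illustrative assumptions.

```python
def height_to_color(height, h_min, h_max):
    """Map a height to an RGB tuple: lowest -> blue, highest -> red,
    interpolating linearly (a simple color-scale assumption)."""
    t = (height - h_min) / (h_max - h_min)
    r = int(255 * t)
    return (r, 0, 255 - r)

# Sample range: minimum height 125, maximum height 150.
lowest = height_to_color(125, 125, 150)
highest = height_to_color(150, 125, 150)
middle = height_to_color(137.5, 125, 150)
```

A gray scale map is the same idea with a single channel, brighter for higher portions.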
According to an exemplary embodiment, the ultrasound apparatus 1000 may transparently or semi-transparently display the portions of the entire ultrasound image, displayed based on the ultrasound image data, other than the partial ultrasound image corresponding to the at least one ROI.
According to an exemplary embodiment, the ultrasound apparatus 1000 may three-dimensionally display a partial ultrasound image corresponding to at least one ROI by using a light source-based rendering method. The light source-based rendering method (e.g., ray tracing) refers to a rendering method that traces a virtual visible ray to determine a color of a pixel. For example, the light source-based rendering method may process an object surface brightness, a light reflection effect, and a light refraction effect in the relationship between light and an object.
According to an exemplary embodiment, the ultrasound apparatus 1000 may receive a selection of at least one rendering method among a plurality of rendering methods and three-dimensionally display the partial ultrasound image corresponding to at least one ROI by the selected rendering method. The rendering method described herein may include, but is not limited to, a shading-based rendering method, a light source-based rendering (e.g., ray tracing) method, a radiosity-based rendering method, a volume rendering method, an image-based rendering (IBR) method, and a non-photorealistic rendering (NPR) method. Hereinafter, for convenience of description, the shading-based rendering method and the light source-based rendering method (e.g., ray tracing) will be described as examples. The shading-based rendering method calculates the brightness of an object based on the properties of light.
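The shading-based rendering method mentioned above can be sketched with simple diffuse (Lambertian) shading of the height map: brightness is the dot product of the surface normal and the light direction. The overhead light direction and the specific normal construction are illustrative assumptions, not the patent's prescribed renderer.

```python
import numpy as np

def lambert_shade(heights, light=(0.0, 0.0, 1.0)):
    """Shade a height map with diffuse (Lambertian) lighting:
    brightness = max(0, surface normal . light direction)."""
    gy, gx = np.gradient(heights.astype(float))
    # A surface normal of z = h(x, y) is (-dh/dx, -dh/dy, 1), normalized.
    n = np.stack([-gx, -gy, np.ones_like(gx)], axis=-1)
    n /= np.linalg.norm(n, axis=-1, keepdims=True)
    l = np.asarray(light, dtype=float)
    l /= np.linalg.norm(l)
    return np.clip((n * l).sum(axis=-1), 0.0, 1.0)

flat = lambert_shade(np.zeros((4, 4)))                   # flat surface
slope = lambert_shade(np.tile(np.arange(4.0), (4, 1)))   # plane tilted along x
```

A flat region facing the overhead light renders at full brightness, while tilted regions render darker, which is what makes height differences visually apparent in the 3D partial ultrasound image.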
According to an exemplary embodiment, when the user selects a plurality of ROIs, the ultrasound apparatus 1000 may three-dimensionally display partial ultrasound images corresponding respectively to the plurality of ROIs. For example, the ultrasound apparatus 1000 may display a first partial ultrasound image corresponding to a first ROI and a second partial ultrasound image corresponding to a second ROI as a height map.
In this case, according to an exemplary embodiment, the ultrasound apparatus 1000 may display the 3D first partial ultrasound image and the 3D second partial ultrasound image overlappingly on the ultrasound image, or may display the 3D first partial ultrasound image and the 3D second partial ultrasound image separately from the ultrasound image.
According to an exemplary embodiment, when the first ROI and the second ROI have different sizes, the first partial ultrasound image and the second partial ultrasound image may have different sizes. In this case, the ultrasound apparatus 1000 may adjust the size of at least one of the first partial ultrasound image and the second partial ultrasound image before displaying the first partial ultrasound image and the second partial ultrasound image overlappingly on the ultrasound image.
According to an exemplary embodiment, the ultrasound apparatus 1000 may adjust the size of the first partial ultrasound image and the size of the second partial ultrasound image to a predetermined size and display the size-adjusted first and second partial ultrasound images overlappingly on the ultrasound image. For example, the ultrasound apparatus 1000 may adjust the size of the first partial ultrasound image and the size of the second partial ultrasound image to the same size (e.g., 3×4 pixels) and display the same-size first and second partial ultrasound images overlappingly on the ultrasound image. Also, the ultrasound apparatus 1000 may adjust the size of the first partial ultrasound image to the size of the second partial ultrasound image and display the resulting first and second partial ultrasound images overlappingly on the ultrasound image.
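The size adjustment above can be sketched with a nearest-neighbor resize; the patent does not specify a resampling method, so nearest-neighbor, the target size, and the sample image values are illustrative assumptions.

```python
def resize_nearest(image, out_h, out_w):
    """Nearest-neighbor resize of a 2D image given as a list of rows."""
    in_h, in_w = len(image), len(image[0])
    return [[image[r * in_h // out_h][c * in_w // out_w]
             for c in range(out_w)]
            for r in range(out_h)]

first = [[1, 2], [3, 4]]                        # 2x2 first partial image
second = [[5, 6, 7], [8, 9, 10], [11, 12, 13]]  # 3x3 second partial image
# Adjust both partial images to the same predetermined 3x4 size.
first_rs = resize_nearest(first, 3, 4)
second_rs = resize_nearest(second, 3, 4)
```

Both resized images can then be displayed overlappingly on the ultrasound image at a common size, making their height maps directly comparable.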
As illustrated in
When the user selects the template button 411, the ultrasound apparatus 1000 may provide a template list 420. A template may be a figure used to select the ROI 400. The template list 420 may include, but is not limited to, a circle, a cone, an ellipse, a tetragon, a pentagon, a triangle, and a free curve. For example, as illustrated in
According to an exemplary embodiment, when the user selects a circle from the template list 420, the ultrasound apparatus 1000 may display a circle on an ultrasound image. In this case, the user may select the ROI 400 by changing the position or size of the circle displayed on the ultrasound image.
According to another exemplary embodiment, the user may select the ROI 400 by drawing a line on the ultrasound image by using a touch tool (e.g., a finger or an electronic pen), a mouse, or a track ball.
When the user selects the pointer button 412 and selects a first point in the ultrasound image, the ultrasound apparatus 1000 may select an ROI of a predetermined size (e.g., a circle having a radius of about 2 cm, an ellipse having an about 2 cm major axis and an about 1 cm minor axis, or a tetragon having an area of about 4 cm2) based on the first point.
Also, when the user selects the magic button 413, the ultrasound apparatus 1000 may display a magic icon for selecting a desired point. When the user moves the magic icon and selects a second point in the ultrasound image, the ultrasound apparatus 1000 may select a region having a pattern, which is similar to a pattern of the second point, as the ROI 400. For example, the ultrasound apparatus 1000 may select a region having a color value, which is equal or similar to a color value of the second point, as the ROI 400. Also, when the user selects a desired point in a tumor, the ultrasound apparatus 1000 may automatically select a tumor portion having similar pattern information (e.g., similar brightness) to the desired point, as the ROI 400.
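The magic-tool selection above — growing a region with pattern information similar to the selected point — can be sketched as a flood fill that admits 4-connected neighbors whose brightness is within a tolerance of the seed pixel. The brightness-difference criterion and the tolerance value are illustrative assumptions; the patent's similarity measure may be pattern-based (e.g., GLCM).

```python
from collections import deque

def select_similar_region(image, seed, tol=10):
    """Grow an ROI from a seed pixel, adding 4-connected neighbors whose
    brightness is within `tol` of the seed (a simple similarity criterion)."""
    h, w = len(image), len(image[0])
    sy, sx = seed
    base = image[sy][sx]
    roi, queue = set(), deque([seed])
    while queue:
        y, x = queue.popleft()
        if (y, x) in roi or not (0 <= y < h and 0 <= x < w):
            continue
        if abs(image[y][x] - base) > tol:
            continue
        roi.add((y, x))
        queue.extend([(y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)])
    return roi

# Bright tumor-like patch in the top-left corner of a toy image.
img = [[200, 205, 40],
       [198, 202, 45],
       [ 50,  60, 55]]
roi = select_similar_region(img, (0, 0), tol=10)
```

Selecting the seed inside a bright region returns only the contiguous pixels of similar brightness, mirroring how a user's single tap can select a whole tumor portion as the ROI.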
According to an exemplary embodiment, the ultrasound apparatus 1000 may also provide a color list 430 for enabling the user to add a color to the ROI 400. In this case, the user (e.g., a sonographer) may add a color to the ROI 400 to increase the discrimination of the ROI 400.
As illustrated in
In this case, the ultrasound apparatus 1000 may convert brightness values included in the ROI 510 into height values. The ultrasound apparatus 1000 may generate a 3D partial ultrasound image 520 of the ROI 510 by using the height values.
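The brightness-to-height conversion described above can be sketched as a linear mapping over the ROI pixels. The function name, the 8-bit brightness range, and the maximum height are illustrative assumptions; the disclosure also allows mapping-table-based conversions.

```python
# Minimal sketch: map each brightness value (0..max_brightness) inside the
# ROI linearly onto a height value (0..max_height). Brighter pixels become
# taller points of the 3D partial ultrasound image.
def brightness_to_height(roi_pixels, max_brightness=255, max_height=100.0):
    scale = max_height / max_brightness
    return [[b * scale for b in row] for row in roi_pixels]
```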
As illustrated in
As illustrated in
As illustrated in
According to an exemplary embodiment, the ultrasound apparatus 1000 may display the 3D partial ultrasound image 620 in a separate region from the ultrasound image 600. Also, according to an exemplary embodiment, the ultrasound apparatus 1000 may display the 3D partial ultrasound image 620 at a size increased or decreased relative to that of the ROI 610.
Also, according to an exemplary embodiment, the ultrasound apparatus 1000 may display additional information 630 related to the 3D partial ultrasound image 620. For example, the ultrasound apparatus 1000 may display an average height value of 141, a maximum height value of 150, and a minimum height value of 125 as the additional information 630.
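The additional information 630 above (average, maximum, and minimum height value) can be computed directly from the ROI's height map; this small sketch reproduces the example values, with the flattening of the height map as an implementation assumption.

```python
# Compute the summary statistics shown as additional information 630.
def height_stats(height_map):
    flat = [h for row in height_map for h in row]
    return {"avg": sum(flat) / len(flat), "max": max(flat), "min": min(flat)}
```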
According to an exemplary embodiment, the ultrasound apparatus 1000 may move the position of the 3D partial ultrasound image 620, adjust the size of the 3D partial ultrasound image 620, or rotate the 3D partial ultrasound image 620 in a predetermined direction. Also, the ultrasound apparatus 1000 may display the 3D partial ultrasound image 620 as a gray scale map or a color scale map, and may display a contour line, a contour surface, and a numerical value in the 3D partial ultrasound image 620.
As illustrated in
According to an exemplary embodiment, the ultrasound apparatus 1000 may select a red region as an ROI 710 based on a user input. For example, when the user touches a red region with a magic icon 711, the ultrasound apparatus 1000 may select the red region as the ROI 710.
As illustrated in
As illustrated in
As illustrated in
While
As illustrated in
According to an exemplary embodiment, the user may select the liver parenchyma as an ROI 900.
As illustrated in
Also, as illustrated in
As illustrated in
According to an exemplary embodiment, the ultrasound apparatus 1000 may display the 3D first partial ultrasound image in the first ROI 1010 and display the 3D second partial ultrasound image in the second ROI 1020.
As illustrated in
According to an exemplary embodiment, the ultrasound apparatus 1000 may display the 3D first partial ultrasound image 1103 and the 3D second partial ultrasound image 1104 separately from the abdominal ultrasound image 1105. In this case, the ultrasound apparatus 1000 may display the 3D first partial ultrasound image 1103 and the 3D second partial ultrasound image 1104 in a side view.
According to an exemplary embodiment, the ultrasound apparatus 1000 may adjust at least one of the position, the rotation angle, and the size of the 3D first partial ultrasound image 1103 and the 3D second partial ultrasound image 1104. Also, the ultrasound apparatus 1000 may display the 3D first partial ultrasound image 1103 and the 3D second partial ultrasound image 1104 overlappingly on the abdominal ultrasound image 1105. In this case, the ultrasound apparatus 1000 may adjust the 3D first partial ultrasound image 1103 and the 3D second partial ultrasound image 1104 to the same size and display the same-size 3D first and second partial ultrasound images 1103 and 1104 overlappingly on the abdominal ultrasound image 1105.
Also, the ultrasound apparatus 1000 may three-dimensionally display a difference image between the 3D first partial ultrasound image 1103 and the 3D second partial ultrasound image 1104.
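A difference image between two size-adjusted partial ultrasound images, as described above, could be built by resampling both height maps to a common size and subtracting them pixelwise. The nearest-neighbor resampling and all names here are illustrative assumptions, not the disclosed implementation.

```python
# Resample a height map to (out_rows x out_cols) with nearest-neighbor
# sampling, then take the pixelwise difference of two resampled maps.
def resize_nearest(grid, out_rows, out_cols):
    in_rows, in_cols = len(grid), len(grid[0])
    return [[grid[r * in_rows // out_rows][c * in_cols // out_cols]
             for c in range(out_cols)] for r in range(out_rows)]

def difference_map(height_a, height_b, rows, cols):
    a = resize_nearest(height_a, rows, cols)
    b = resize_nearest(height_b, rows, cols)
    return [[a[r][c] - b[r][c] for c in range(cols)] for r in range(rows)]
```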
The above components will be described below.
The ultrasound image acquiring unit 1100 according to an exemplary embodiment may acquire ultrasound image data of an object 10. The ultrasound image data according to an exemplary embodiment may be 2D ultrasound image data or 3D ultrasound image data of the object 10.
According to an exemplary embodiment, the ultrasound image acquiring unit 1100 may include a probe 20, an ultrasound transmission/reception unit 1110, and an image processing unit 1120.
The probe 20 transmits an ultrasound signal to the object 10 according to a driving signal applied from the ultrasound transmission/reception unit 1110 and receives an echo signal reflected from the object 10. The probe 20 includes a plurality of transducers, and the plurality of transducers oscillate according to an electrical signal transmitted thereto and generate an ultrasound wave, that is, acoustic energy. Also, the probe 20 may be connected to a main body of the ultrasound apparatus 1000 wiredly or wirelessly. According to embodiments, the ultrasound apparatus 1000 may include a plurality of probes 20. According to an exemplary embodiment, the probe 20 may include at least one of a one-dimensional (1D) probe, a 1.5D probe, a 2D (matrix) probe, and a 3D probe.
A transmission unit 1111 supplies a driving signal to the probe 20 and includes a pulse generating unit 1113, a transmission delaying unit 1114, and a pulser 1115. The pulse generating unit 1113 generates pulses for forming transmission ultrasound waves according to a predetermined pulse repetition frequency (PRF), and the transmission delaying unit 1114 applies a delay time for determining transmission directionality to the pulses. The pulses to which a delay time is applied correspond to a plurality of piezoelectric vibrators included in the probe 20, respectively. The pulser 1115 applies a driving signal (or a driving pulse) to the probe 20 at a timing corresponding to each pulse to which a delay time is applied.
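The delay-time computation of the transmission delaying unit 1114 can be sketched for a linear array: each element is delayed so that all wavefronts arrive at the focal point simultaneously. The array geometry, speed of sound, and function names are simplifying assumptions for illustration.

```python
import math

# Per-element transmit delays (seconds) for focusing a linear array at
# (focus_x, focus_z); pitch and coordinates in meters, c = speed of sound.
def transmit_delays(num_elements, pitch, focus_x, focus_z, c=1540.0):
    center = (num_elements - 1) / 2.0
    # Distance from each element to the focal point.
    dists = [math.hypot((i - center) * pitch - focus_x, focus_z)
             for i in range(num_elements)]
    # The element farthest from the focus fires first (zero delay).
    max_d = max(dists)
    return [(max_d - d) / c for d in dists]
```

For a focus on the array axis, the edge elements fire first and the center element last, producing a concave wavefront converging at the focal depth.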
A reception unit 1112 generates ultrasound data by processing echo signals received from the probe 20 and may include an amplifier 1116, an analog-digital converter (ADC) 1117, a reception delaying unit 1118, and a summing unit 1119. The amplifier 1116 amplifies echo signals in each channel, and the ADC 1117 analog-digital converts the amplified echo signals. The reception delaying unit 1118 applies delay times for determining reception directionality to the digital-converted echo signals, and the summing unit 1119 generates ultrasound image data by summing the echo signals processed by the reception delaying unit 1118.
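The reception chain above (amplify, delay, sum) can be sketched as a toy delay-and-sum beamformer. The integer-sample delay model, the single gain value, and the names are simplifying assumptions rather than the actual design of the reception delaying unit 1118 and summing unit 1119.

```python
# Toy delay-and-sum: per-channel echo samples are amplified, shifted by an
# integer sample delay, and summed into one beamformed output line.
def delay_and_sum(channels, sample_delays, gain=1.0):
    """channels: list of per-channel sample lists; sample_delays: integer
    delay (in samples) applied to each channel before summation."""
    length = len(channels[0])
    out = [0.0] * length
    for ch, d in zip(channels, sample_delays):
        for i in range(length):
            j = i - d  # read the sample that arrived d samples earlier
            if 0 <= j < length:
                out[i] += gain * ch[j]
    return out
```

With correct delays, echoes from the same reflector align in time and add coherently, while off-axis echoes remain misaligned and partially cancel.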
The image processing unit 1120 generates an ultrasound image by scan-converting ultrasound image data generated by the ultrasound transmission/reception unit 1110. The ultrasound image may include not only a gray-scale ultrasound image obtained by scanning the object according to an amplitude (A) mode, a brightness (B) mode, and a motion (M) mode, but also a Doppler image representing a motion of a moving object by using a Doppler effect. The Doppler image may include a bloodstream Doppler image (also referred to as a color Doppler image) representing a flow of blood, a tissue Doppler image representing a motion of a tissue, and a spectral Doppler image representing a movement speed of an object in a waveform.
A B mode processing unit 1123 extracts B mode components from ultrasound image data and processes the B mode components. An image generating unit 1122 may generate an ultrasound image representing signal intensities as brightness based on the B mode components extracted by the B mode processing unit 1123.
Likewise, a Doppler processing unit 1124 may extract Doppler components from ultrasound image data, and the image generating unit 1122 may generate a Doppler image representing a motion of an object as colors or waveforms based on the extracted Doppler components.
The image generating unit 1122 according to an exemplary embodiment may generate a 3D ultrasound image through volume-rendering of volume data and may also generate an elasticity image that visualizes deformation of the object 10 due to a pressure.
In addition, the image generating unit 1122 may display various additional information in an ultrasound image by texts or graphics. For example, the image generating unit 1122 may add at least one annotation related to all or a portion of the ultrasound image to the ultrasound image. That is, the image generating unit 1122 may analyze the ultrasound image and add at least one annotation related to all or a portion of the ultrasound image to the ultrasound image based on the analysis result. Also, the image generating unit 1122 may add additional information corresponding to an ROI selected by the user to the ultrasound image.
The image processing unit 1120 may extract an ROI from the ultrasound image by using an image processing algorithm. In this case, the image processing unit 1120 may add a color, a pattern, or a frame to the ROI.
The image processing unit 1120 may convert image pixel information (e.g., a brightness value, a color value, a speed value, and an elasticity value) acquired from the ultrasound image data into height values. In this case, the image processing unit 1120 may convert image pixel information included in the ROI selected by the user into height values.
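Since the memory 1500 stores mapping tables of pixel values to height values, the conversion above can be sketched as a table lookup with linear interpolation between entries. The table layout and function name are assumptions for illustration.

```python
# Convert a pixel value (e.g., brightness, speed, or elasticity) into a
# height value via a sorted mapping table of (pixel_value, height) pairs,
# interpolating linearly between the nearest entries.
def height_from_table(value, table):
    if value <= table[0][0]:
        return table[0][1]
    if value >= table[-1][0]:
        return table[-1][1]
    for (x0, h0), (x1, h1) in zip(table, table[1:]):
        if x0 <= value <= x1:
            t = (value - x0) / (x1 - x0)
            return h0 + t * (h1 - h0)
```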
The user input unit 1200 refers to a unit through which the user (e.g., a sonographer) inputs data for controlling the ultrasound apparatus 1000. For example, the user input unit 1200 may include, but is not limited to, a keypad, a dome switch, a touch pad (e.g., a capacitive overlay type, a resistive overlay type, an infrared beam type, a surface acoustic wave type, an integral strain gauge type, or a piezoelectric type), a track ball, and a jog switch. Also, the user input unit 1200 may further include various input units such as an electrocardiogram measuring module, a breath measuring module, a voice recognition sensor, a gesture recognition sensor, a fingerprint recognition sensor, an iris recognition sensor, a depth sensor, and a distance sensor.
According to an exemplary embodiment, the user input unit 1200 may detect not only a real touch but also a proximity touch. The user input unit 1200 may detect a touch input (e.g., a touch & hold, a tap, a double tap, or a flick) to the ultrasound image. Also, the user input unit 1200 may detect a drag input from a point at which a touch input is detected. The user input unit 1200 may detect a multi-touch input (e.g., a pinch) to at least two points included in the ultrasound image.
The user input unit 1200 may receive a selection of at least one ROI in the ultrasound image. In this case, various inputs may be used to select an ROI. Also, the ROI may have various shapes.
The user input unit 1200 may receive a selection of at least one rendering method among a plurality of rendering methods.
The controller 1300 may control overall operations of the ultrasound apparatus 1000. For example, the controller 1300 may control overall operations of the ultrasound image acquiring unit 1100, the user input unit 1200, the communication unit 1400, the memory 1500, and the display 1600.
The controller 1300 may convert image pixel information corresponding to at least one ROI into height values. The controller 1300 may determine a sign of the height values based on movement direction information of a tissue or a bloodstream when the ultrasound image is a color Doppler image or a spectral Doppler image.
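The sign rule above can be sketched as follows for a Doppler pixel: the magnitude of the pixel value sets the height, and the movement direction sets the sign. The convention chosen here (flow toward the probe positive, away negative) and the names are assumptions for illustration.

```python
# Signed height for a Doppler pixel: magnitude encodes speed, the movement
# direction of the tissue or bloodstream determines the sign.
def signed_height(magnitude, toward_probe, scale=0.5):
    """magnitude: pixel intensity (0..255) encoding speed; toward_probe:
    True if the flow moves toward the probe. Returns a signed height."""
    height = magnitude * scale
    return height if toward_probe else -height
```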
The controller 1300 may control the display 1600 to three-dimensionally display a partial ultrasound image corresponding to at least one ROI by using the height values. The controller 1300 may apply a semitransparent effect or a transparent effect to regions of the ultrasound image other than the partial ultrasound image.
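The semitransparent effect described above can be sketched as alpha blending applied only outside the overlay region; the mask representation, background color, and names are illustrative assumptions.

```python
# Fade pixels outside the ROI toward a background color so the 3D overlay
# stands out: alpha=1 leaves them unchanged, alpha=0 makes them fully
# transparent (i.e., replaced by the background).
def fade_outside_roi(image, roi_mask, alpha=0.3, background=0):
    rows, cols = len(image), len(image[0])
    return [[image[r][c] if roi_mask[r][c]
             else alpha * image[r][c] + (1 - alpha) * background
             for c in range(cols)] for r in range(rows)]
```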
The controller 1300 may select an ROI of a predetermined size based on a point selected by the user. The controller 1300 may select a region, which has a predetermined similarity value or more with respect to pattern information of a point selected by the user, in the ultrasound image as the ROI.
The controller 1300 may adjust at least one of a position, a rotation angle, and a size of the 3D partial ultrasound image based on a user input.
The communication unit 1400 may include one or more components for enabling communication between the ultrasound apparatus 1000 and a server 2000, between the ultrasound apparatus 1000 and a first device 3000, and between the ultrasound apparatus 1000 and a second device 4000. For example, the communication unit 1400 may include a short-range communication module 1410, a wired communication module 1420, and a mobile communication module 1430.
The short-range communication module 1410 refers to a module for short-range communication within a predetermined distance. Short-range communication technologies may include Wireless Local Area Network (LAN), Wi-Fi, Bluetooth, Bluetooth Low Energy (BLE), Ultra Wideband (UWB), Zigbee, Near Field Communication (NFC), Wi-Fi Direct (WFD), and Infrared Data Association (IrDA).
The wired communication module 1420 refers to a module for communication using electrical signals or optical signals. Examples of wired communication techniques according to an exemplary embodiment may include a pair cable, a coaxial cable, an optical fiber cable, and an Ethernet cable.
The mobile communication module 1430 transmits and receives wireless signals to and from at least one of a base station, external devices (e.g., the first and second devices 3000 and 4000), and the server 2000 on a mobile communication network. Herein, the wireless signals may include voice call signals, video call signals, or various types of data for transmission and reception of text/multimedia messages.
The communication unit 1400 is wiredly or wirelessly connected to a network 30 to communicate with an external device (e.g., the first device 3000 or the second device 4000) or the server 2000. The communication unit 1400 may exchange data with a hospital server or other medical apparatuses in a hospital connected through a Picture Archiving and Communication System (PACS). Also, the communication unit 1400 may perform data communication according to the Digital Imaging and Communications in Medicine (DICOM) standard.
The communication unit 1400 may transmit and receive data related to a diagnosis of the object 10, such as an ultrasound image, ultrasound image data, and Doppler image data of the object 10, through the network 30 and may also transmit and receive medical images captured by other medical devices, such as a CT image, a MRI image, and an X-ray image. In addition, the communication unit 1400 may receive information related to a diagnosis history or a treatment schedule of a patient from the server 2000 and utilize the information to diagnose the object 10.
The memory 1500 may store a program for processing of the controller 1300 and may store input/output data (e.g., ultrasound image data, a mapping table of a brightness value and a height value, a mapping table of a speed value and a height value, a mapping table of a color value and a height value, a mapping table of an elasticity value and a height value, information about an ROI, ultrasound images, testee information, probe information, body markers, and additional information).
The memory 1500 may include at least one type of storage medium from among flash memory type, hard disk type, multimedia card micro type, card type memory (e.g., SD and XD memories), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electronically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, magnetic disk, and optical disk. Also, the ultrasound apparatus 1000 may utilize a web storage or a cloud server that functions as the memory 1500 on the Internet.
The display 1600 may display information processed in the ultrasound apparatus 1000. For example, the display 1600 may display an ultrasound image or may display a user interface (UI) or a graphical user interface (GUI) related to a control panel.
The display 1600 may display a partial region of the ultrasound image three-dimensionally and display the other regions two-dimensionally. For example, the display 1600 may display a partial ultrasound image corresponding to an ROI as a height map.
The display 1600 may three-dimensionally display a partial ultrasound image in the ROI of the ultrasound image. The display 1600 may three-dimensionally display a partial ultrasound image in a region different from a region in which the ultrasound image is displayed.
The display 1600 may three-dimensionally display a partial ultrasound image by using a light source-based rendering method. The display 1600 may three-dimensionally display a partial ultrasound image by a rendering method that is selected among a plurality of rendering methods by the user.
The display 1600 may display a 3D partial ultrasound image in at least one form of a gray scale map, a color scale map, a contour line map, a contour surface map, and a numerical value map.
The display 1600 may further display at least one of an average height value, a maximum height value, and a minimum height value of the partial ultrasound image. The display 1600 may three-dimensionally display a first partial ultrasound image corresponding to a first ROI and a second partial ultrasound image corresponding to a second ROI. In this case, the display 1600 may display the first partial ultrasound image and the second partial ultrasound image in a side view.
The display 1600 may display the 3D first partial ultrasound image and the 3D second partial ultrasound image overlappingly on the ultrasound image. In this case, the display 1600 may display the 3D first partial ultrasound image and the 3D second partial ultrasound image, which are adjusted to the same size by the controller 1300, overlappingly on the ultrasound image. The display 1600 may three-dimensionally display a difference image between the first partial ultrasound image and the second partial ultrasound image.
When the display 1600 includes a touchscreen with a layer structure of a touch pad, the display 1600 may be used as an input device in addition to an output device. The display 1600 may include at least one of a liquid crystal display, a thin film transistor liquid crystal display, an organic light-emitting diode, a flexible display, a 3D display, and an electrophoretic display. Also, the ultrasound apparatus 1000 may include two or more displays 1600 according to embodiments.
The methods according to the exemplary embodiments may be embodied in the form of program commands executable through various computer means, which may be recorded on a computer-readable recording medium. The computer-readable recording medium may include program commands, data files, and data structures either alone or in combination. The program commands recorded on the computer-readable recording medium may be those that are especially designed and configured for the inventive concept, or may be those that are known and available to computer programmers skilled in the art. Examples of the computer-readable recording medium include magnetic recording media such as hard disks, floppy disks and magnetic tapes, optical recording media such as CD-ROMs and DVDs, magneto-optical recording media such as floptical disks, and hardware devices such as ROMs, RAMs and flash memories that are especially configured to store and execute program commands. Examples of the program commands include machine language codes that may be generated by a compiler, and high-level language codes that may be executed by a computer by using an interpreter.
As described above, according to one or more of the above exemplary embodiments, the ultrasound apparatus 1000 provides a 3D height map of an ROI selected by the user, thereby improving the convenience of the user's ultrasound diagnosis. Also, the operation speed may be increased compared with the case of providing an overall height map, and the height image of the ROI may be prevented from being covered by a height component of a region of uninterest (ROU).
It should be understood that the exemplary embodiments described herein should be considered in a descriptive sense only and not for purposes of limitation. Descriptions of features or aspects within each exemplary embodiment should typically be considered as available for other similar features or aspects in other exemplary embodiments.
While one or more exemplary embodiments have been described with reference to the figures, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the scope of the inventive concept as defined by the following claims.
Executed on | Assignor | Assignee | Conveyance | Reel/Frame
Nov 26 2014 | OH, DONG-HOON | SAMSUNG MEDISON CO., LTD. | Assignment of assignors interest | 034380/0627
Nov 26 2014 | HYUN, DONG-GYU | SAMSUNG MEDISON CO., LTD. | Assignment of assignors interest | 034380/0627
Dec 04 2014 | Samsung Medison Co., Ltd. (assignment on the face of the patent)