US9510803B2 - Providing compound image of Doppler spectrum images in ultrasound system

Providing compound image of Doppler spectrum images in ultrasound system

Info

Publication number
US9510803B2
US13/730,033 US201213730033A US9510803B2
Authority
US
United States
Prior art keywords
pixels
ultrasound
data
doppler spectrum
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US13/730,033
Other versions
US20130172754A1
Inventor
Hyoung Jin Kim
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Medison Co Ltd
Original Assignee
Samsung Medison Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Medison Co Ltd filed Critical Samsung Medison Co Ltd
Assigned to SAMSUNG MEDISON CO., LTD. Assignment of assignors interest (see document for details). Assignors: KIM, HYOUNG JIN
Publication of US20130172754A1
Application granted
Publication of US9510803B2
Legal status: Active
Adjusted expiration

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/13: Tomography
    • A61B 8/14: Echo-tomography
    • A61B 8/06: Measuring blood flow
    • A61B 8/48: Diagnostic techniques
    • A61B 8/488: Diagnostic techniques involving Doppler signals
    • A61B 8/52: Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/5215: Devices using data or image processing specially adapted for diagnosis, involving processing of medical diagnostic data
    • A61B 8/5238: Processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image
    • A61B 8/5246: Combining images from the same or different imaging techniques, e.g. color Doppler and B-mode
    • A61B 8/5253: Combining overlapping images, e.g. spatial compounding
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 1/00: General purpose image data processing
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00: Details of television systems
    • H04N 5/222: Studio circuitry; Studio devices; Studio equipment
    • H04N 5/262: Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects

Definitions

  • the present disclosure generally relates to ultrasound systems, and more particularly to providing a compound image of Doppler spectrum images in an ultrasound system.
  • An ultrasound system has become an important and popular diagnostic tool since it has a wide range of applications. Specifically, due to its non-invasive and non-destructive nature, the ultrasound system has been extensively used in the medical profession. Modern high-performance ultrasound systems and techniques are commonly used to produce two-dimensional or three-dimensional ultrasound images of internal features of target objects (e.g., human organs).
  • the ultrasound system may provide ultrasound images of various modes including a brightness mode image representing reflection coefficients of ultrasound signals (i.e., ultrasound echo signals) reflected from a target object of a living body with a two-dimensional image, a Doppler mode image representing velocity of a moving target object with spectral Doppler by using a Doppler effect, a color Doppler mode image representing velocity of the moving target object with colors by using the Doppler effect, an elastic image representing mechanical characteristics of tissues before and after applying compression thereto, and the like.
  • the ultrasound system may transmit the ultrasound signals to the living body and receive the ultrasound echo signals from the living body to form Doppler signals corresponding to a region of interest, which is set on the brightness mode image.
  • the ultrasound system may further form the color Doppler mode image representing the velocity of the moving target object with colors based on the Doppler signals.
  • the color Doppler image may represent the motion of the target object (e.g., blood flow) with the colors.
  • the color Doppler image may be used to diagnose disease of a blood vessel, a heart and the like.
  • the respective color indicated by a motion value is a function of the velocity of the target object only along the transmission direction of the ultrasound signals, i.e., of whether the target object moves toward or away from the ultrasound probe.
  • the ultrasound system may set a sample volume on the brightness mode image, transmit ultrasound signals to the living body based on an ensemble number, and receive ultrasound echo signals from the living body to form a Doppler spectrum image corresponding to the sample volume.
  • there are provided embodiments for forming Doppler spectrum images corresponding to at least two sample volumes and compounding the Doppler spectrum images to provide a compound image.
  • an ultrasound system comprises: a processing unit configured to form at least two Doppler spectrum images corresponding to at least two sample volumes based on ultrasound data corresponding to the at least two sample volumes, the processing unit being further configured to perform an image process for forming a compound image upon the at least two Doppler spectrum images to form the compound image.
  • a method of providing a compound image, comprising: a) forming at least two Doppler spectrum images corresponding to at least two sample volumes based on ultrasound data corresponding to the at least two sample volumes; and b) performing an image process for forming a compound image upon the at least two Doppler spectrum images to form the compound image.
  • FIG. 1 is a block diagram showing an illustrative embodiment of an ultrasound system.
  • FIG. 2 is a schematic diagram showing an example of a brightness mode image and sample volumes.
  • FIG. 3 is a block diagram showing an illustrative embodiment of an ultrasound data acquiring unit.
  • FIG. 4 is a schematic diagram showing an example of sampling data and pixels of an ultrasound image.
  • FIGS. 5 to 8 are schematic diagrams showing examples of performing a reception beam-forming.
  • FIG. 9 is a schematic diagram showing an example of setting weights.
  • FIG. 10 is a schematic diagram showing an example of setting a sampling data set.
  • FIG. 11 is a flow chart showing a process of forming a compound image of Doppler spectrum images.
  • FIG. 12 is a schematic diagram showing an example of the sample volumes and the Doppler spectrum images.
  • FIGS. 13 to 15 are schematic diagrams showing examples of the compound images.
  • the ultrasound system 100 may include a user input unit 110 .
  • the user input unit 110 may be configured to receive input information from a user.
  • the input information may include information for setting at least two sample volumes (e.g., SV 1 , SV 2 , SV 3 ) on a brightness mode image BI, as shown in FIG. 2 . That is, the input information may include the number, position and size of the sample volumes. However, it should be noted herein that the input information may not be limited thereto.
  • the reference numeral BV represents a blood vessel.
  • the user input unit 110 may include a control panel, a track ball, a touch screen, a mouse, a keyboard and the like.
  • the ultrasound system 100 may further include an ultrasound data acquiring unit 120 .
  • the ultrasound data acquiring unit 120 may be configured to transmit ultrasound signals to a living body.
  • the living body may include target objects (e.g., blood vessel, heart, blood flow, etc).
  • the ultrasound data acquiring unit 120 may be further configured to receive ultrasound signals (i.e., ultrasound echo signals) from the living body to acquire ultrasound data corresponding to an ultrasound image.
  • FIG. 3 is a block diagram showing an illustrative embodiment of the ultrasound data acquiring unit.
  • the ultrasound data acquiring unit 120 may include an ultrasound probe 310 .
  • the ultrasound probe 310 may include a plurality of elements (not shown) for reciprocally converting between ultrasound signals and electrical signals.
  • the ultrasound probe 310 may be configured to transmit the ultrasound signals to the living body.
  • the ultrasound probe 310 may be further configured to receive the ultrasound echo signals from the living body to output electrical signals (hereinafter referred to as “reception signals”).
  • the reception signals may be analog signals.
  • the ultrasound probe 310 may include a convex probe, a linear probe and the like.
  • the ultrasound data acquiring unit 120 may further include a transmitting section 320 .
  • the transmitting section 320 may be configured to control the transmission of the ultrasound signals.
  • the transmitting section 320 may be further configured to generate electrical signals (hereinafter referred to as “transmission signals”) in consideration of the elements.
  • the transmitting section 320 may be configured to generate transmission signals (hereinafter referred to as “brightness mode transmission signals”) for obtaining the brightness mode image BI in consideration of the elements.
  • the ultrasound probe 310 may be configured to convert the brightness mode transmission signals provided from the transmitting section 320 into the ultrasound signals, transmit the ultrasound signals to the living body, and receive the ultrasound echo signals from the living body to output reception signals (hereinafter referred to as “brightness mode reception signals”).
  • the transmitting section 320 may be further configured to generate transmission signals (hereinafter referred to as “Doppler mode transmission signals”) for obtaining Doppler spectrum images corresponding to the at least two sample volumes based on an ensemble number.
  • the ensemble number may represent the number of times that the ultrasound signals are transmitted and received.
  • the ultrasound probe 310 may be configured to convert the Doppler mode transmission signals provided from the transmitting section 320 into the ultrasound signals and transmit the ultrasound signals to the living body.
  • the ultrasound signals transmitted from the ultrasound probe 310 may be plane wave signals.
  • the ultrasound probe 310 may be further configured to receive the ultrasound echo signals from the living body in at least one reception direction to output reception signals (hereinafter referred to as “Doppler mode reception signals”).
  • the ultrasound data acquiring unit 120 may further include a receiving section 330 .
  • the receiving section 330 may be configured to perform an analog-digital conversion upon the reception signals provided from the ultrasound probe 310 to form sampling data.
  • the receiving section 330 may be also configured to perform a reception beam-forming upon the sampling data in consideration of the elements to form reception-focused data. The reception beam-forming will be described below in detail.
  • the receiving section 330 may be configured to perform the analog-digital conversion upon the brightness mode reception signals provided from the ultrasound probe 310 to form sampling data (hereinafter referred to as “brightness mode sampling data”).
  • the receiving section 330 may be further configured to perform the reception beam-forming upon the brightness mode sampling data to form reception-focused data (hereinafter referred to as “brightness mode reception-focused data”).
  • the receiving section 330 may be further configured to perform the analog-digital conversion upon the Doppler mode reception signals provided from the ultrasound probe 310 to form sampling data (hereinafter referred to as “Doppler mode sampling data”).
  • the receiving section 330 may be also configured to perform the reception beam-forming upon the Doppler mode sampling data to form reception-focused data (hereinafter referred to as “Doppler mode reception-focused data”) corresponding to the at least two sample volumes.
  • the receiving section 330 may perform the reception beam-forming upon the Doppler mode sampling data to form first Doppler mode reception-focused data corresponding to the sample volume SV 1 .
  • the receiving section 330 may further perform the reception beam-forming upon the Doppler mode sampling data to form second Doppler mode reception-focused data corresponding to the sample volume SV 2 .
  • the receiving section 330 may also perform the reception beam-forming upon the Doppler mode sampling data to form third Doppler mode reception-focused data corresponding to the sample volume SV 3 .
  • reception beam-forming may be described with reference to the accompanying drawings.
  • the receiving section 330 may be configured to perform the analog-digital conversion upon the reception signals provided through a plurality of channels CH k , wherein 1≤k≤N, from the ultrasound probe 310 to form sampling data S i,j , wherein i and j are positive integers, as shown in FIG. 4 .
  • the sampling data S i,j may be stored in a storage unit 140 .
  • the receiving section 330 may be further configured to detect pixels corresponding to the sampling data based on positions of the elements and positions (orientation) of pixels of the ultrasound image UI with respect to the elements.
  • the receiving section 330 may select, during the reception beam-forming, the pixels for which the respective sampling data are used as pixel data, based on the positions of the elements and the orientation of the respective pixels of the ultrasound image UI with respect to the elements.
  • the receiving section 330 may be configured to cumulatively assign the sampling data corresponding to the selected pixels as the pixel data.
  • the receiving section 330 may be configured to set a curve (hereinafter referred to as “reception beam-forming curve”) CV 6,3 for selecting the pixels for which the sampling data S 6,3 are used as the pixel data during the reception beam-forming, based on the positions of the elements and the orientation of the respective pixels of the ultrasound image UI with respect to the elements, as shown in FIG. 5 .
  • the receiving section 330 may be further configured to detect the pixels P 3,1 , P 3,2 , P 4,2 , P 4,3 , P 4,4 , P 4,5 , P 4,6 , P 4,7 , P 4,8 , P 4,9 , . . . P 3,N corresponding to the reception beam-forming curve CV 6,3 from the pixels P a,b of the ultrasound image UI.
  • the receiving section 330 may select the pixels P 3,1 , P 3,2 , P 4,2 , P 4,3 , P 4,4 , P 4,5 , P 4,6 , P 4,7 , P 4,8 , P 4,9 , . . . P 3,N through which the reception beam-forming curve CV 6,3 passes among the pixels P a,b of the ultrasound image UI.
  • the receiving section 330 may be also configured to assign the sampling data S 6,3 to the selected pixels P 3,1 , P 3,2 , P 4,2 , P 4,3 , P 4,4 , P 4,5 , P 4,6 , P 4,7 , P 4,8 , P 4,9 , . . . P 3,N , as shown in FIG. 6 .
  • the receiving section 330 may be configured to set a reception beam-forming curve CV 6,4 for selecting the pixels for which the sampling data S 6,4 are used as the pixel data during the reception beam-forming, based on the positions of the elements and the orientation of the respective pixels of the ultrasound image UI with respect to the elements, as shown in FIG. 7 .
  • the receiving section 330 may be further configured to detect the pixels P 2,1 , P 3,1 , P 3,2 , P 4,2 , P 4,3 , P 4,4 , P 5,4 , P 5,5 , P 5,6 , P 5,7 , P 5,8 , P 4,9 , P 5,9 , . . . P 4,N , P 3,N corresponding to the reception beam-forming curve CV 6,4 from the pixels P a,b of the ultrasound image UI.
  • the receiving section 330 may select the pixels P 2,1 , P 3,1 , P 3,2 , P 4,2 , P 4,3 , P 4,4 , P 5,4 , P 5,5 , P 5,6 , P 5,7 , P 5,8 , P 4,9 , P 5,9 , . . . P 4,N , P 3,N through which the reception beam-forming curve CV 6,4 passes among the pixels P a,b of the ultrasound image UI.
  • the receiving section 330 may be further configured to assign the sampling data S 6,4 to the selected pixels P 2,1 , P 3,1 , P 3,2 , P 4,2 , P 4,3 , P 4,4 , P 5,4 , P 5,5 , P 5,6 , P 5,7 , P 5,8 , P 5,9 , . . . P 4,N , P 3,N , as shown in FIG. 8 .
  • in this way, the respective sampling data, which are used as the pixel data, may be cumulatively assigned to the corresponding pixels.
  • the receiving section 330 may be configured to perform the reception beam-forming (i.e., summing) upon the sampling data, which are cumulatively assigned to the respective pixels P a,b of the ultrasound image UI to form the reception-focused data.
  • the receiving section 330 may be configured to perform the analog-digital conversion upon the reception signals provided through the plurality of channels CH k from the ultrasound probe 310 to form the sampling data S i,j , as shown in FIG. 4 .
  • the sampling data S i,j may be stored in the storage unit 140 .
  • the receiving section 330 may be further configured to detect pixels corresponding to the sampling data based on the positions of the elements and the positions (orientation) of the pixels of the ultrasound image UI with respect to the elements. That is, the receiving section 330 may select, during the reception beam-forming, the pixels for which the respective sampling data are used as the pixel data, based on the positions of the elements and the orientation of the respective pixels of the ultrasound image UI with respect to the elements.
  • the receiving section 330 may be configured to cumulatively assign the sampling data corresponding to the selected pixels as the pixel data.
  • the receiving section 330 may be further configured to determine pixels existing in the same column among the selected pixels.
  • the receiving section 330 may be also configured to set weights corresponding to the respective determined pixels.
  • the receiving section 330 may be additionally configured to apply the weights to the sampling data of the respective pixels.
  • the receiving section 330 may be configured to set the reception beam-forming curve CV 6,3 for selecting the pixels for which the sampling data S 6,3 are used as the pixel data during the reception beam-forming, based on the positions of the elements and the orientation of the respective pixels of the ultrasound image UI with respect to the elements, as shown in FIG. 5 .
  • the receiving section 330 may be further configured to detect the pixels P 3,1 , P 3,2 , P 4,2 , P 4,3 , P 4,4 , P 4,5 , P 4,6 , P 4,7 , P 4,8 , P 4,9 , . . . P 3,N corresponding to the reception beam-forming curve CV 6,3 from the pixels P a,b of the ultrasound image UI.
  • the receiving section 330 may select the pixels P 3,1 , P 3,2 , P 4,2 , P 4,3 , P 4,4 , P 4,5 , P 4,6 , P 4,7 , P 4,8 , P 4,9 , . . . P 3,N through which the reception beam-forming curve CV 6,3 passes among the pixels P a,b of the ultrasound image UI.
  • the receiving section 330 may be also configured to assign the sampling data S 6,3 to the selected pixels P 3,1 , P 3,2 , P 4,2 , P 4,3 , P 4,4 , P 4,5 , P 4,6 , P 4,7 , P 4,8 , P 4,9 , . . . P 3,N , as shown in FIG. 6 .
  • the receiving section 330 may be further configured to determine pixels P 3,2 and P 4,2 , which exist in the same column among the selected pixels P 3,1 , P 3,2 , P 4,2 , P 4,3 , P 4,4 , P 4,5 , P 4,6 , P 4,7 , P 4,8 , P 4,9 , . . . P 3,N .
  • the receiving section 330 may be further configured to calculate a distance W 1 from a center of the determined pixel P 3,2 to the reception beam-forming curve CV 6,3 and a distance W 2 from a center of the determined pixel P 4,2 to the reception beam-forming curve CV 6,3 , as shown in FIG. 9 .
  • the receiving section 330 may be additionally configured to set a first weight α 1 corresponding to the pixel P 3,2 based on the distance W 1 and a second weight α 2 corresponding to the pixel P 4,2 based on the distance W 2 .
  • the first weight α 1 and the second weight α 2 may be set to be proportional or inversely proportional to the calculated distances.
  • the receiving section 330 may be further configured to apply the first weight α 1 to the sampling data S 6,3 assigned to the pixel P 3,2 and to apply the second weight α 2 to the sampling data S 6,3 assigned to the pixel P 4,2 .
  • the receiving section 330 may be configured to perform the above process upon the remaining sampling data.
  • the receiving section 330 may be configured to perform the reception beam-forming upon the sampling data, which are cumulatively assigned to the respective pixels P a,b of the ultrasound image UI to form the reception-focused data.
  • the receiving section 330 may be configured to perform the analog-digital conversion upon the reception signals provided through the plurality of channels CH k from the ultrasound probe 310 to form the sampling data S i,j , as shown in FIG. 4 .
  • the sampling data S i,j may be stored in the storage unit 140 .
  • the receiving section 330 may be further configured to set a sampling data set based on the sampling data S i,j . That is, the receiving section 330 may set the sampling data set for selecting the pixels for which the sampling data S i,j are used as the pixel data during the reception beam-forming.
  • the receiving section 330 may be configured to set the sampling data S 1,1 , S 1,4 , . . . S 1,t , S 2,1 , S 2,4 , . . . S 2,t , . . . S p,t as the sampling data set (denoted by a box) for selecting the pixels for which the sampling data S are used as the pixel data during the reception beam-forming, as shown in FIG. 10 .
  • the receiving section 330 may be further configured to detect the pixels corresponding to the respective sampling data of the sampling data set based on the positions of the elements and the positions (orientation) of the respective pixels of the ultrasound image UI with respect to the elements. That is, the receiving section 330 may select, during the reception beam-forming, the pixels for which the respective sampling data of the sampling data set are used as the pixel data, based on the positions of the elements and the orientation of the respective pixels of the ultrasound image UI with respect to the elements.
  • the receiving section 330 may be further configured to cumulatively assign the sampling data to the selected pixels in the same manner as in the above embodiments.
  • the receiving section 330 may be also configured to perform the reception beam-forming upon the sampling data, which are cumulatively assigned to the respective pixels of the ultrasound image UI to form the reception-focused data.
  • the receiving section 330 may be configured to perform a down-sampling upon the reception signals provided through the plurality of channels CH k from the ultrasound probe 310 to form down-sampling data.
  • the receiving section 330 may be further configured to detect the pixels corresponding to the respective sampling data, based on the positions of the elements and the positions (orientation) of the respective pixels of the ultrasound image UI with respect to the elements. That is, the receiving section 330 may select, during the reception beam-forming, the pixels for which the respective sampling data are used as the pixel data, based on the positions of the elements and the orientation of the pixels of the ultrasound image UI with respect to the elements.
  • the receiving section 330 may be further configured to cumulatively assign the respective sampling data to the selected pixels in the same manner as in the above embodiments.
  • the receiving section 330 may be further configured to perform the reception beam-forming upon the sampling data, which are cumulatively assigned to the respective pixels of the ultrasound image UI to form the reception-focused data.
  • reception beam-forming may not be limited thereto.
  • the ultrasound data acquiring unit 120 may further include an ultrasound data forming section 340 .
  • the ultrasound data forming section 340 may be configured to form the ultrasound data corresponding to the ultrasound image based on the reception-focused data provided from the receiving section 330 .
  • the ultrasound data forming section 340 may be further configured to perform a signal process (e.g., gain control, etc) upon the reception-focused data.
  • the ultrasound data forming section 340 may be configured to form ultrasound data (hereinafter referred to as “brightness mode ultrasound data”) corresponding to the brightness mode image BI based on the brightness mode reception-focused data provided from the receiving section 330 .
  • the brightness mode ultrasound data may include radio frequency data. However, it should be noted herein that the brightness mode ultrasound data may not be limited thereto.
  • the ultrasound data forming section 340 may be further configured to form ultrasound data (hereinafter referred to as “Doppler mode ultrasound data”) corresponding to the at least two sample volumes based on the Doppler mode reception-focused data provided from the receiving section 330 .
  • Doppler mode ultrasound data may include in-phase/quadrature data. However, it should be noted herein that the Doppler mode ultrasound data may not be limited thereto.
  • the ultrasound data forming section 340 may form first Doppler mode ultrasound data corresponding to the sample volume SV 1 based on the first Doppler mode reception-focused data provided from the receiving section 330 .
  • the ultrasound data forming section 340 may further form second Doppler mode ultrasound data corresponding to the sample volume SV 2 based on the second Doppler mode reception-focused data provided from the receiving section 330 .
  • the ultrasound data forming section 340 may further form third Doppler mode ultrasound data corresponding to the sample volume SV 3 based on the third Doppler mode reception-focused data provided from the receiving section 330 .
  • the ultrasound system 100 may further include a processing unit 130 in communication with the user input unit 110 and the ultrasound data acquiring unit 120 .
  • the processing unit 130 may include a central processing unit, a microprocessor, a graphic processing unit and the like.
  • FIG. 11 is a flow chart showing a process of forming a compound image of the Doppler spectrum images.
  • the processing unit 130 may be configured to form the brightness mode image BI based on the brightness mode ultrasound data provided from the ultrasound data acquiring unit 120 , at step S 1102 in FIG. 11 .
  • the brightness mode image BI may be displayed on a display unit 150 .
  • the processing unit 130 may be configured to set the at least two sample volumes on the brightness mode image BI based on the input information provided from the user input unit 110 , at step S 1104 in FIG. 11 .
  • the ultrasound data acquiring unit 120 may be configured to transmit the ultrasound signals to the living body and receive the ultrasound echo signals from the living body to acquire the Doppler mode ultrasound data corresponding to the at least two sample volumes.
  • the processing unit 130 may be configured to form Doppler signals corresponding to each of the at least two sample volumes based on the Doppler mode ultrasound data provided from the ultrasound data acquiring unit 120 , at step S 1106 in FIG. 11 .
  • the methods of forming the Doppler signals are well known in the art. Thus, they have not been described in detail so as not to unnecessarily obscure the present disclosure.
  • the processing unit 130 may form first Doppler signals corresponding to the sample volume SV 1 based on the first Doppler mode ultrasound data provided from the ultrasound data acquiring unit 120 .
  • the processing unit 130 may further form second Doppler signals corresponding to the sample volume SV 2 based on the second Doppler mode ultrasound data provided from the ultrasound data acquiring unit 120 .
  • the processing unit 130 may further form third Doppler signals corresponding to the sample volume SV 3 based on the third Doppler mode ultrasound data provided from the ultrasound data acquiring unit 120 .
  • the processing unit 130 may be configured to form the at least two Doppler spectrum images corresponding to the at least two sample volumes based on the Doppler signals corresponding to the at least two sample volumes, at step S 1108 in FIG. 11 .
  • the methods of forming the Doppler spectrum image are well known in the art. Thus, they have not been described in detail so as not to unnecessarily obscure the present disclosure.
  • the processing unit 130 may form a first Doppler spectrum image corresponding to the sample volume SV 1 based on the first Doppler signals corresponding to the sample volume SV 1 .
  • the processing unit 130 may further form a second Doppler spectrum image corresponding to the sample volume SV 2 based on the second Doppler signals corresponding to the sample volume SV 2 .
  • the processing unit 130 may further form a third Doppler spectrum image corresponding to the sample volume SV 3 based on the third Doppler signals corresponding to the sample volume SV 3 .
  • the processing unit 130 may be configured to perform an image process upon the at least two Doppler spectrum images to form a compound image of the at least two Doppler spectrum images, at step S 1110 in FIG. 11 .
  • the image process may include an image process for adding pixel values (i.e., brightness values) corresponding to pixels at the same positions on the at least two Doppler spectrum images, an image process for subtracting the pixel values corresponding to the pixels at the same positions, an image process for multiplying the pixel values corresponding to the pixels at the same positions, an image process for performing various blending processes among the at least two Doppler spectrum images, and the like (a minimal sketch of these pixel-wise operations is given after this list).
  • the processing unit 130 may detect the pixels of the same positions on the first Doppler spectrum image to the third Doppler spectrum image, based on positions (orientation) of pixels corresponding to each of the first Doppler spectrum image to the third Doppler spectrum image.
  • the processing unit 130 may further perform the image process for adding the pixel values corresponding to the detected pixels to form the compound image CI 1 as shown in FIG. 13 .
  • the processing unit 130 may detect the pixels of the same positions on the first Doppler spectrum image to the third Doppler spectrum image, based on the positions (orientation) of the pixels corresponding to each of the first Doppler spectrum image to the third Doppler spectrum image.
  • the processing unit 130 may further perform the image process for subtracting the pixel values corresponding to the detected pixels to form the compound image CI 2 as shown in FIG. 14 .
  • the processing unit 130 may detect the pixels of the same positions on the first Doppler spectrum image to the third Doppler spectrum image, based on the positions (orientation) of the pixels corresponding to each of the first Doppler spectrum image to the third Doppler spectrum image.
  • the processing unit 130 may further perform the image process for multiplying the pixel values corresponding to the detected pixels to form the compound image CI 3 as shown in FIG. 15 .
  • the processing unit 130 may be configured to perform a mean trace upon the compound image.
  • the ultrasound system 100 may further include the storage unit 140 .
  • the storage unit 140 may store the ultrasound data (i.e., brightness mode ultrasound data and Doppler mode ultrasound data) acquired by the ultrasound data acquiring unit 120 .
  • the storage unit 140 may further store the Doppler signals formed by the processing unit 130 .
  • the storage unit 140 may further store the input information received by the user input unit 110 .
  • the storage unit 140 may further store the brightness mode image, the Doppler spectrum images, and the compound image.
  • the ultrasound system 100 may further include the display unit 150 .
  • the display unit 150 may be configured to display the brightness mode image formed by the processing unit 130 .
  • the display unit 150 may be further configured to display the Doppler spectrum images formed by the processing unit 130 .
  • the display unit 150 may be further configured to display the compound image formed by the processing unit 130 .
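
The pixel-wise compounding referred to above (step S 1110) can be sketched as follows. This is only an illustration, not the patent's implementation: it assumes the Doppler spectrum images already share the same pixel grid (the same time and velocity axes), and the function name, mode strings and blending weights are illustrative.

import numpy as np

def compound_spectra(spectra, mode="add", weights=None):
    # spectra: list of Doppler spectrum images of equal shape (brightness values).
    # Combine the pixel values at identical positions into one compound image.
    stack = np.stack(spectra).astype(float)
    if mode == "add":                         # e.g., the compound image CI1 (FIG. 13)
        out = stack.sum(axis=0)
    elif mode == "subtract":                  # e.g., the compound image CI2 (FIG. 14)
        out = stack[0] - stack[1:].sum(axis=0)
    elif mode == "multiply":                  # e.g., the compound image CI3 (FIG. 15)
        out = stack.prod(axis=0)
    elif mode == "blend":                     # a weighted blending process
        w = np.asarray(weights, dtype=float)
        out = np.tensordot(w / w.sum(), stack, axes=1)
    else:
        raise ValueError("unknown mode: %s" % mode)
    return np.clip(out, 0, 255).astype(np.uint8)   # back to displayable brightness values

# Example (variable names hypothetical): compound the spectra of SV1 to SV3.
# compound_image = compound_spectra([spectrum_sv1, spectrum_sv2, spectrum_sv3], mode="add")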

Abstract

There are provided embodiments for providing a compound image of Doppler spectrum images corresponding to at least two sample volumes. In one embodiment, by way of non-limiting example, an ultrasound system comprises: a processing unit configured to form at least two Doppler spectrum images corresponding to at least two sample volumes based on ultrasound data corresponding to the at least two sample volumes, the processing unit being further configured to perform an image process for forming a compound image upon the at least two Doppler spectrum images to form the compound image.

Description

CROSS-REFERENCE TO RELATED APPLICATIONS
The present application claims priority from Korean Patent Application No. 10-2011-0144445 filed on Dec. 28, 2011, the entire subject matter of which is incorporated herein by reference.
TECHNICAL FIELD
The present disclosure generally relates to ultrasound systems, and more particularly to providing a compound image of Doppler spectrum images in an ultrasound system.
BACKGROUND
An ultrasound system has become an important and popular diagnostic tool since it has a wide range of applications. Specifically, due to its non-invasive and non-destructive nature, the ultrasound system has been extensively used in the medical profession. Modern high-performance ultrasound systems and techniques are commonly used to produce two-dimensional or three-dimensional ultrasound images of internal features of target objects (e.g., human organs).
The ultrasound system may provide ultrasound images of various modes including a brightness mode image representing reflection coefficients of ultrasound signals (i.e., ultrasound echo signals) reflected from a target object of a living body with a two-dimensional image, a Doppler mode image representing velocity of a moving target object with spectral Doppler by using a Doppler effect, a color Doppler mode image representing velocity of the moving target object with colors by using the Doppler effect, an elastic image representing mechanical characteristics of tissues before and after applying compression thereto, and the like.
The ultrasound system may transmit the ultrasound signals to the living body and receive the ultrasound echo signals from the living body to form Doppler signals corresponding to a region of interest, which is set on the brightness mode image. The ultrasound system may further form the color Doppler mode image representing the velocity of the moving target object with colors based on the Doppler signals. In particular, the color Doppler image may represent the motion of the target object (e.g., blood flow) with the colors. The color Doppler image may be used to diagnose disease of a blood vessel, a heart and the like. However, it is difficult to represent the accurate motion of the target object (e.g., blood flow), since the color indicated by a motion value is a function of the velocity of the target object only along the transmission direction of the ultrasound signals, i.e., of whether the target object moves toward or away from the ultrasound probe.
Particularly, the ultrasound system may set a sample volume on the brightness mode image, transmit ultrasound signals to the living body based on an ensemble number, and receive ultrasound echo signals from the living body to form a Doppler spectrum image corresponding to the sample volume.
SUMMARY
There are provided embodiments for forming Doppler spectrum images corresponding to at least two sample volumes and compounding the Doppler spectrum images to provide a compound image.
In one embodiment, by way of non-limiting example, an ultrasound system comprises: a processing unit configured to form at least two Doppler spectrum images corresponding to at least two sample volumes based on ultrasound data corresponding to the at least two sample volumes, the processing unit being further configured to perform an image process for forming a compound image upon the at least two Doppler spectrum images to form the compound image.
In another embodiment, there is provided a method of providing a compound image, comprising: a) forming at least two Doppler spectrum images corresponding to at least two sample volumes based on ultrasound data corresponding to the at least two sample volumes; and b) performing an image process for forming a compound image upon the at least two Doppler spectrum images to form the compound image.
The Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key or essential features of the claimed subject matter, nor is it intended to be used in determining the scope of the claimed subject matter.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a block diagram showing an illustrative embodiment of an ultrasound system.
FIG. 2 is a schematic diagram showing an example of a brightness mode image and sample volumes.
FIG. 3 is a block diagram showing an illustrative embodiment of an ultrasound data acquiring unit.
FIG. 4 is a schematic diagram showing an example of sampling data and pixels of an ultrasound image.
FIGS. 5 to 8 are schematic diagrams showing examples of performing a reception beam-forming.
FIG. 9 is a schematic diagram showing an example of setting weights.
FIG. 10 is a schematic diagram showing an example of setting a sampling data set.
FIG. 11 is a flow chart showing a process of forming a compound image of Doppler spectrum images.
FIG. 12 is a schematic diagram showing an example of the sample volumes and the Doppler spectrum images.
FIGS. 13 to 15 are schematic diagrams showing examples of the compound images.
DETAILED DESCRIPTION
A detailed description may be provided with reference to the accompanying drawings. One of ordinary skill in the art may realize that the following description is illustrative only and is not in any way limiting. Other embodiments of the present invention may readily suggest themselves to such skilled persons having the benefit of this disclosure.
Referring to FIG. 1, an ultrasound system 100 in accordance with an illustrative embodiment is shown. As depicted therein, the ultrasound system 100 may include a user input unit 110.
The user input unit 110 may be configured to receive input information from a user. In one embodiment, the input information may include information for setting at least two sample volumes (e.g., SV1, SV2, SV3) on a brightness mode image BI, as shown in FIG. 2. That is, the input information may include the number, position and size of the sample volumes. However, it should be noted herein that the input information may not be limited thereto. In FIG. 2, the reference numeral BV represents a blood vessel. The user input unit 110 may include a control panel, a track ball, a touch screen, a mouse, a keyboard and the like.
The ultrasound system 100 may further include an ultrasound data acquiring unit 120. The ultrasound data acquiring unit 120 may be configured to transmit ultrasound signals to a living body. The living body may include target objects (e.g., blood vessel, heart, blood flow, etc). The ultrasound data acquiring unit 120 may be further configured to receive ultrasound signals (i.e., ultrasound echo signals) from the living body to acquire ultrasound data corresponding to an ultrasound image.
FIG. 3 is a block diagram showing an illustrative embodiment of the ultrasound data acquiring unit. Referring to FIG. 3, the ultrasound data acquiring unit 120 may include an ultrasound probe 310.
The ultrasound probe 310 may include a plurality of elements (not shown) for reciprocally converting between ultrasound signals and electrical signals. The ultrasound probe 310 may be configured to transmit the ultrasound signals to the living body. The ultrasound probe 310 may be further configured to receive the ultrasound echo signals from the living body to output electrical signals (hereinafter referred to as “reception signals”). The reception signals may be analog signals. The ultrasound probe 310 may include a convex probe, a linear probe and the like.
The ultrasound data acquiring unit 120 may further include a transmitting section 320. The transmitting section 320 may be configured to control the transmission of the ultrasound signals. The transmitting section 320 may be further configured to generate electrical signals (hereinafter referred to as “transmission signals”) in consideration of the elements.
In one embodiment, the transmitting section 320 may be configured to generate transmission signals (hereinafter referred to as “brightness mode transmission signals”) for obtaining the brightness mode image BI in consideration of the elements. Thus, the ultrasound probe 310 may be configured to convert the brightness mode transmission signals provided from the transmitting section 320 into the ultrasound signals, transmit the ultrasound signals to the living body, and receive the ultrasound echo signals from the living body to output reception signals (hereinafter referred to as “brightness mode reception signals”).
The transmitting section 320 may be further configured to generate transmission signals (hereinafter referred to as “Doppler mode transmission signals”) for obtaining Doppler spectrum images corresponding to the at least two sample volumes based on an ensemble number. The ensemble number may represent the number of times that the ultrasound signals are transmitted and received. Thus, the ultrasound probe 310 may be configured to convert the Doppler mode transmission signals provided from the transmitting section 320 into the ultrasound signals and transmit the ultrasound signals to the living body. The ultrasound signals transmitted from the ultrasound probe 310 may be plane wave signals. The ultrasound probe 310 may be further configured to receive the ultrasound echo signals from the living body in at least one reception direction to output reception signals (hereinafter referred to as “Doppler mode reception signals”).
The ultrasound data acquiring unit 120 may further include a receiving section 330. The receiving section 330 may be configured to perform an analog-digital conversion upon the reception signals provided from the ultrasound probe 310 to form sampling data. The receiving section 330 may be also configured to perform a reception beam-forming upon the sampling data in consideration of the elements to form reception-focused data. The reception beam-forming will be described below in detail.
In one embodiment, the receiving section 330 may be configured to perform the analog-digital conversion upon the brightness mode reception signals provided from the ultrasound probe 310 to form sampling data (hereinafter referred to as “brightness mode sampling data”). The receiving section 330 may be further configured to perform the reception beam-forming upon the brightness mode sampling data to form reception-focused data (hereinafter referred to as “brightness mode reception-focused data”).
The receiving section 330 may be further configured to perform the analog-digital conversion upon the Doppler mode reception signals provided from the ultrasound probe 310 to form sampling data (hereinafter referred to as “Doppler mode sampling data”). The receiving section 330 may be also configured to perform the reception beam-forming upon the Doppler mode sampling data to form reception-focused data (hereinafter referred to as “Doppler mode reception-focused data”) corresponding to the at least two sample volumes.
For example, the receiving section 330 may perform the reception beam-forming upon the Doppler mode sampling data to form first Doppler mode reception-focused data corresponding to the sample volume SV1. The receiving section 330 may further perform the reception beam-forming upon the Doppler mode sampling data to form second Doppler mode reception-focused data corresponding to the sample volume SV2. The receiving section 330 may also perform the reception beam-forming upon the Doppler mode sampling data to form third Doppler mode reception-focused data corresponding to the sample volume SV3.
The reception beam-forming will now be described with reference to the accompanying drawings.
In one embodiment, the receiving section 330 may be configured to perform the analog-digital conversion upon the reception signals provided through a plurality of channels CHk, wherein 1≦k≦N, from the ultrasound probe 310 to form sampling data Si,j, wherein i and j are positive integers, as shown in FIG. 4. The sampling data Si,j may be stored in a storage unit 140. The receiving section 330 may be further configured to detect pixels corresponding to the sampling data based on positions of the elements and positions (orientation) of pixels of the ultrasound image UI with respect to the elements. That is, the receiving section 330 may select, during the reception beam-forming, the pixels for which the respective sampling data are used as pixel data, based on the positions of the elements and the orientation of the respective pixels of the ultrasound image UI with respect to the elements. The receiving section 330 may be configured to cumulatively assign the sampling data to the selected pixels as the pixel data.
For example, the receiving section 330 may be configured to set a curve (hereinafter referred to as “reception beam-forming curve”) CV6,3 for selecting the pixels for which the sampling data S6,3 are used as the pixel data during the reception beam-forming, based on the positions of the elements and the orientation of the respective pixels of the ultrasound image UI with respect to the elements, as shown in FIG. 5. The receiving section 330 may be further configured to detect the pixels P3,1, P3,2, P4,2, P4,3, P4,4, P4,5, P4,6, P4,7, P4,8, P4,9, . . . P3,N corresponding to the reception beam-forming curve CV6,3 from the pixels Pa,b of the ultrasound image UI, wherein 1≦b≦N. That is, the receiving section 330 may select the pixels P3,1, P3,2, P4,2, P4,3, P4,4, P4,5, P4,6, P4,7, P4,8, P4,9, . . . P3,N through which the reception beam-forming curve CV6,3 passes among the pixels Pa,b of the ultrasound image UI. The receiving section 330 may be also configured to assign the sampling data S6,3 to the selected pixels P3,1, P3,2, P4,2, P4,3, P4,4, P4,5, P4,6, P4,7, P4,8, P4,9, . . . P3,N, as shown in FIG. 6.
Thereafter, the receiving section 330 may be configured to set a reception beam-forming curve CV6,4 for selecting the pixels for which the sampling data S6,4 are used as the pixel data during the reception beam-forming, based on the positions of the elements and the orientation of the respective pixels of the ultrasound image UI with respect to the elements, as shown in FIG. 7. The receiving section 330 may be further configured to detect the pixels P2,1, P3,1, P3,2, P4,2, P4,3, P4,4, P5,4, P5,5, P5,6, P5,7, P5,8, P4,9, P5,9, . . . P4,N, P3,N corresponding to the reception beam-forming curve CV6,4 from the pixels Pa,b of the ultrasound image UI. That is, the receiving section 330 may select the pixels P2,1, P3,1, P3,2, P4,2, P4,3, P4,4, P5,4, P5,5, P5,6, P5,7, P5,8, P4,9, P5,9, . . . P4,N, P3,N through which the reception beam-forming curve CV6,4 passes among the pixels Pa,b of the ultrasound image UI. The receiving section 330 may be further configured to assign the sampling data S6,4 to the selected pixels P2,1, P3,1, P3,2, P4,2, P4,3, P4,4, P5,4, P5,5, P5,6, P5,7, P5,8, P5,9, . . . P4,N, P3,N, as shown in FIG. 8. In this way, the respective sampling data, which are used as the pixel data, may be cumulatively assigned to the corresponding pixels.
The receiving section 330 may be configured to perform the reception beam-forming (i.e., summing) upon the sampling data, which are cumulatively assigned to the respective pixels Pa,b of the ultrasound image UI to form the reception-focused data.
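The following is a minimal, illustrative sketch of the pixel-based reception beam-forming described above; it is not the patent's implementation. Each sample is accumulated onto the pixels whose round-trip delay to the corresponding element maps to that sample index (the pixels on the reception beam-forming curve), and the per-pixel sums form the reception-focused data. The element and pixel coordinates, the sampling rate fs, the sound speed c and the plane-wave transmit path are all assumptions introduced for the example.

import numpy as np

def pixel_beamform(sampling_data, element_x, element_z, pixel_x, pixel_z, fs, c=1540.0):
    # sampling_data: shape (n_elements, n_samples), one row per channel CHk.
    # Returns the per-pixel sums of the cumulatively assigned samples
    # (the "reception-focused data").
    n_elem, n_samples = sampling_data.shape
    px, pz = np.meshgrid(pixel_x, pixel_z)            # pixel centers Pa,b
    focused = np.zeros(px.shape)
    for i in range(n_elem):
        # Receive path: distance from element i to every pixel.
        rx = np.hypot(px - element_x[i], pz - element_z[i])
        # Transmit path: assumed plane wave at normal incidence, i.e., depth / c.
        delay = np.rint((pz + rx) / c * fs).astype(int)
        valid = (delay >= 0) & (delay < n_samples)
        # Pixels whose delay maps to sample index j lie on the reception
        # beam-forming curve for Si,j; cumulatively assign that sample to them.
        focused[valid] += sampling_data[i, delay[valid]]
    return focused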
In another embodiment, the receiving section 330 may be configured to perform the analog-digital conversion upon the reception signals provided through the plurality of channels CHk from the ultrasound probe 310 to form the sampling data Si,j, as shown in FIG. 4. The sampling data Si,j may be stored in the storage unit 140. The receiving section 330 may be further configured to detect pixels corresponding to the sampling data based on the positions of the elements and the positions (orientation) of the pixels of the ultrasound image UI with respect to the elements. That is, the receiving section 330 may select, during the reception beam-forming, the pixels for which the respective sampling data are used as the pixel data, based on the positions of the elements and the orientation of the respective pixels of the ultrasound image UI with respect to the elements. The receiving section 330 may be configured to cumulatively assign the sampling data to the selected pixels as the pixel data. The receiving section 330 may be further configured to determine pixels existing in the same column among the selected pixels. The receiving section 330 may be also configured to set weights corresponding to the respective determined pixels. The receiving section 330 may be additionally configured to apply the weights to the sampling data of the respective pixels.
For example, the receiving section 330 may be configured to set the reception beam-forming curve CV6,3 for selecting the pixels for which the sampling data S6,3 are used as the pixel data during the reception beam-forming, based on the positions of the elements and the orientation of the respective pixels of the ultrasound image UI with respect to the elements, as shown in FIG. 5. The receiving section 330 may be further configured to detect the pixels P3,1, P3,2, P4,2, P4,3, P4,4, P4,5, P4,6, P4,7, P4,8, P4,9, . . . P3,N corresponding to the reception beam-forming curve CV6,3 from the pixels Pa,b of the ultrasound image UI, wherein 1≦a≦M, 1≦b≦N. That is, the receiving section 330 may select the pixels P3,1, P3,2, P4,2, P4,3, P4,4, P4,5, P4,6, P4,7, P4,8, P4,9, . . . P3,N through which the reception beam-forming curve CV6,3 passes among the pixels Pa,b of the ultrasound image UI. The receiving section 330 may be also configured to assign the sampling data S6,3 to the selected pixels P3,1, P3,2, P4,2, P4,3, P4,4, P4,5, P4,6, P4,7, P4,8, P4,9, . . . P3,N, as shown in FIG. 6. The receiving section 330 may be further configured to determine pixels P3,2 and P4,2, which exist in the same column among the selected pixels P3,1, P3,2, P4,2, P4,3, P4,4, P4,5, P4,6, P4,7, P4,8, P4,9, . . . P3,N. The receiving section 330 may be further configured to calculate a distance W1 from a center of the determined pixel P3,2 to the reception beam-forming curve CV6,3 and a distance W2 from a center of the determined pixel P4,2 to the reception beam-forming curve CV6,3, as shown in FIG. 9. The receiving section 330 may be additionally configured to set a first weight α1 corresponding to the pixel P3,2 based on the distance W1 and a second weight α2 corresponding to the pixel P4,2 based on the distance W2. The first weight α1 and the second weight α2 may be set to be proportional or inversely proportional to the calculated distances. The receiving section 330 may be further configured to apply the first weight α1 to the sampling data S6,3 assigned to the pixel P3,2 and to apply the second weight α2 to the sampling data S6,3 assigned to the pixel P4,2. The receiving section 330 may be configured to perform the above process upon the remaining sampling data.
The receiving section 330 may be configured to perform the reception beam-forming upon the sampling data, which are cumulatively assigned to the respective pixels Pa,b of the ultrasound image UI to form the reception-focused data.
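A sketch of the weighted variant follows. When two pixels of the same column (e.g., P3,2 and P4,2) straddle the beam-forming curve, each receives the sample scaled by a weight derived from the distance between its center and the curve. Inverse-distance weights are used here, which is only one of the two options (proportional or inversely proportional) that the text allows; the function and variable names are illustrative.

import numpy as np

def weighted_assign(focused, sample, col, z_curve, pixel_z, eps=1e-6):
    # Assign one sample to the two same-column pixels that straddle the
    # beam-forming curve depth z_curve, weighted by distance, as in FIG. 9.
    hi = int(np.searchsorted(pixel_z, z_curve))
    lo = hi - 1
    if lo < 0 or hi >= pixel_z.size:
        return
    w1 = abs(pixel_z[lo] - z_curve)          # distance W1 from pixel center to curve
    w2 = abs(pixel_z[hi] - z_curve)          # distance W2
    a1 = 1.0 / (w1 + eps)                    # inverse-distance weights alpha1, alpha2
    a2 = 1.0 / (w2 + eps)
    a1, a2 = a1 / (a1 + a2), a2 / (a1 + a2)  # normalize so the weights sum to one
    focused[lo, col] += a1 * sample
    focused[hi, col] += a2 * sample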
In yet another embodiment, the receiving section 330 may be configured to perform the analog-digital conversion upon the reception signals provided through the plurality of channels CHk from the ultrasound probe 310 to form the sampling data Si,j, as shown in FIG. 4. The sampling data Si,j may be stored in the storage unit 140. The receiving section 330 may be further configured to set a sampling data set based on the sampling data Si,j. That is, the receiving section 330 may set the sampling data set for selecting the pixels for which the sampling data Si,j are used as the pixel data during the reception beam-forming.
For example, the receiving section 330 may be configured to set the sampling data S1,1, S1,4, . . . S1,t, S2,1, S2,4, . . . S2,t, . . . Sp,t as the sampling data set (denoted by a box) for selecting the pixels for which the sampling data S are used as the pixel data during the reception beam-forming, as shown in FIG. 10.
The receiving section 330 may be further configured to detect the pixels corresponding to the respective sampling data of the sampling data set based on the positions of the elements and the positions (orientation) of the respective pixels of the ultrasound image UI with respect to the elements. That is, the receiving section 330 may select, during the reception beam-forming, the pixels for which the respective sampling data of the sampling data set are used as the pixel data, based on the positions of the elements and the orientation of the respective pixels of the ultrasound image UI with respect to the elements. The receiving section 330 may be further configured to cumulatively assign the sampling data to the selected pixels in the same manner as in the above embodiments. The receiving section 330 may be also configured to perform the reception beam-forming upon the sampling data, which are cumulatively assigned to the respective pixels of the ultrasound image UI, to form the reception-focused data.
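As a sketch, restricting the beam-forming to a sampling data set can be done by keeping only the selected samples and reusing the pixel-wise assignment shown earlier. The stride of every third sample per channel is an assumption made for the example; FIG. 10 does not specify which entries belong to the set.

import numpy as np

def beamform_sampling_data_set(sampling_data, stride=3, **geometry):
    # Keep only the samples in the sampling data set; the zeroed samples then
    # contribute nothing when cumulatively assigned to the pixels.
    subset = np.zeros_like(sampling_data)
    subset[:, ::stride] = sampling_data[:, ::stride]
    # geometry carries element_x, element_z, pixel_x, pixel_z and fs for the
    # pixel_beamform sketch given earlier.
    return pixel_beamform(subset, **geometry)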
In yet another embodiment, the receiving section 330 may be configured to perform a down-sampling upon the reception signals provided through the plurality of channels CHk from the ultrasound probe 310 to form down-sampled data. As described above, the receiving section 330 may be further configured to detect the pixels corresponding to the respective down-sampled data based on the positions of the elements and the positions (orientation) of the respective pixels of the ultrasound image UI with respect to the elements. That is, the receiving section 330 may select the pixels that use the respective down-sampled data as their pixel data during the reception beam-forming, based on the positions of the elements and the orientation of the pixels of the ultrasound image UI with respect to the elements. The receiving section 330 may be further configured to cumulatively assign the respective down-sampled data to the selected pixels in the same manner as in the above embodiments. The receiving section 330 may be further configured to perform the reception beam-forming upon the down-sampled data cumulatively assigned to the respective pixels of the ultrasound image UI to form the reception-focused data.
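For illustration, such a down-sampling step might look like the following sketch; the decimation factor, the stand-in reception signals, and the use of scipy.signal.decimate are assumptions, not the disclosed implementation.

```python
import numpy as np
from scipy.signal import decimate

rng = np.random.default_rng(1)
reception_signals = rng.standard_normal((8, 1024))   # one row per channel CHk (stand-in)

# Down-sample each channel by a factor of 2 before the pixel assignment;
# decimate applies an anti-aliasing filter internally.  The factor is an
# illustrative choice.
down_sampled = decimate(reception_signals, q=2, axis=1)
```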
However, it should be noted herein that the reception beam-forming may not be limited thereto.
Referring back to FIG. 3, the ultrasound data acquiring unit 120 may further include an ultrasound data forming section 340. The ultrasound data forming section 340 may be configured to form the ultrasound data corresponding to the ultrasound image based on the reception-focused data provided from the receiving section 330. The ultrasound data forming section 340 may be further configured to perform a signal process (e.g., gain control, etc.) upon the reception-focused data.
In one embodiment, the ultrasound data forming section 340 may be configured to form ultrasound data (hereinafter referred to as “brightness mode ultrasound data”) corresponding to the brightness mode image BI based on the brightness mode reception-focused data provided from the receiving section 330. The brightness mode ultrasound data may include radio frequency data. However, it should be noted herein that the brightness mode ultrasound data may not be limited thereto.
The ultrasound data forming section 340 may be further configured to form ultrasound data (hereinafter referred to as “Doppler mode ultrasound data”) corresponding to the at least two sample volumes based on the Doppler mode reception-focused data provided from the receiving section 330. The Doppler mode ultrasound data may include in-phase/quadrature data. However, it should be noted herein that the Doppler mode ultrasound data may not be limited thereto.
For example, the ultrasound data forming section 340 may form first Doppler mode ultrasound data corresponding to the sample volume SV1 based on the first Doppler mode reception-focused data provided from the receiving section 330. The ultrasound data forming section 340 may further form second Doppler mode ultrasound data corresponding to the sample volume SV2 based on the second Doppler mode reception-focused data provided from the receiving section 330. The ultrasound data forming section 340 may further form third Doppler mode ultrasound data corresponding to the sample volume SV3 based on the third Doppler mode reception-focused data provided from the receiving section 330.
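Since the Doppler mode ultrasound data may include in-phase/quadrature data, a hedged sketch of one conventional way to obtain such data from reception-focused RF samples is given below; the centre frequency, sampling rate, filter order, and the function name rf_to_iq are illustrative assumptions, and the disclosure does not specify the demodulation actually used.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def rf_to_iq(rf, fs, f0):
    """Convert reception-focused RF samples to in-phase/quadrature data by
    mixing down with the centre frequency f0 and low-pass filtering."""
    t = np.arange(rf.shape[-1]) / fs
    mixed = rf * np.exp(-2j * np.pi * f0 * t)       # shift the band around f0 to baseband
    b, a = butter(4, 0.5 * f0 / (fs / 2))           # cutoff is an illustrative choice
    i = filtfilt(b, a, mixed.real, axis=-1)
    q = filtfilt(b, a, mixed.imag, axis=-1)
    return i + 1j * q

# Example: 5 MHz centre frequency sampled at 40 MHz (illustrative values).
rf = np.random.default_rng(2).standard_normal(2048)
iq_data = rf_to_iq(rf, fs=40e6, f0=5e6)
```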
Referring back to FIG. 1, the ultrasound system 100 may further include a processing unit 130 in communication with the user input unit 110 and the ultrasound data acquiring unit 120. The processing unit 130 may include a central processing unit, a microprocessor, a graphic processing unit and the like.
FIG. 11 is a flow chart showing a process of forming a compound image of the Doppler spectrum images. The processing unit 130 may be configured to form the brightness mode image BI based on the brightness mode ultrasound data provided from the ultrasound data acquiring unit 120, at step S1102 in FIG. 11. The brightness mode image BI may be displayed on a display unit 150.
The processing unit 130 may be configured to set the at least two sample volumes on the brightness mode image BI based on the input information provided from the user input unit 110, at step S1104 in FIG. 11. Thus, the ultrasound data acquiring unit 120 may be configured to transmit the ultrasound signals to the living body and receive the ultrasound echo signals from the living body to acquire the Doppler mode ultrasound data corresponding to the at least two sample volumes.
The processing unit 130 may be configured to form Doppler signals corresponding to each of the at least two sample volumes based on the Doppler mode ultrasound data provided from the ultrasound data acquiring unit 120, at step S1106 in FIG. 11. The methods of forming the Doppler signals are well known in the art. Thus, they have not been described in detail so as not to unnecessarily obscure the present disclosure.
For example, the processing unit 130 may form first Doppler signals corresponding to the sample volume SV1 based on the first Doppler mode ultrasound data provided from the ultrasound data acquiring unit 120. The processing unit 130 may further form second Doppler signals corresponding to the sample volume SV2 based on the second Doppler mode ultrasound data provided from the ultrasound data acquiring unit 120. The processing unit 130 may further form third Doppler signals corresponding to the sample volume SV3 based on the third Doppler mode ultrasound data provided from the ultrasound data acquiring unit 120.
The processing unit 130 may be configured to form the at least two Doppler spectrum images corresponding to the at least two sample volumes based on the Doppler signals corresponding to the at least two sample volumes, at step S1108 in FIG. 11. The methods of forming the Doppler spectrum image are well known in the art. Thus, they have not been described in detail so as not to unnecessarily obscure the present disclosure.
For example, the processing unit 130 may form a first Doppler spectrum image corresponding to the sample volume SV1 based on the first Doppler signals corresponding to the sample volume SV1. The processing unit 130 may further form a second Doppler spectrum image corresponding to the sample volume SV2 based on the second Doppler signals corresponding to the sample volume SV2. The processing unit 130 may further form a third Doppler spectrum image corresponding to the sample volume SV3 based on the third Doppler signals corresponding to the sample volume SV3.
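As a hedged illustration of how a Doppler spectrum image might be formed from the Doppler signals of one sample volume, the following sketch uses a short-time Fourier transform; the window length, overlap, log compression, and the simulated slow-time data are assumptions rather than details taken from the disclosure, which relies on well-known spectrum-forming methods.

```python
import numpy as np
from scipy.signal import spectrogram

def doppler_spectrum_image(iq_slow_time, prf, nperseg=64):
    """Build a spectral-Doppler image for one sample volume from its
    slow-time I/Q samples via a short-time Fourier transform."""
    f, t, Sxx = spectrogram(iq_slow_time, fs=prf, nperseg=nperseg,
                            noverlap=nperseg // 2, return_onesided=False)
    power = np.fft.fftshift(np.abs(Sxx), axes=0)   # put zero velocity in the middle row
    return 20.0 * np.log10(power + 1e-12)          # brightness values in dB

# Example: a 4 kHz pulse repetition frequency and simulated slow-time data.
rng = np.random.default_rng(3)
iq = rng.standard_normal(4096) + 1j * rng.standard_normal(4096)
spectrum_image_1 = doppler_spectrum_image(iq, prf=4000.0)
```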
The processing unit 130 may be configured to perform, upon the at least two Doppler spectrum images, an image process for forming a compound image of the at least two Doppler spectrum images, at step S1110 in FIG. 11. In one embodiment, the image process may include an image process for adding pixel values (i.e., brightness values) corresponding to pixels of the same positions on the at least two Doppler spectrum images, an image process for subtracting the pixel values corresponding to the pixels of the same positions on the at least two Doppler spectrum images, an image process for multiplying the pixel values corresponding to the pixels of the same positions on the at least two Doppler spectrum images, an image process for performing various blending processes among the at least two Doppler spectrum images, and the like.
As one example, the processing unit 130 may detect the pixels of the same positions on the first Doppler spectrum image to the third Doppler spectrum image, based on positions (orientation) of pixels corresponding to each of the first Doppler spectrum image to the third Doppler spectrum image. The processing unit 130 may further perform the image process for adding the pixel values corresponding to the detected pixels to form the compound image CI1 as shown in FIG. 13.
As another example, the processing unit 130 may detect the pixels of the same positions on the first Doppler spectrum image to the third Doppler spectrum image, based on the positions (orientation) of the pixels corresponding to each of the first Doppler spectrum image to the third Doppler spectrum image. The processing unit 130 may further perform the image process for subtracting the pixel values corresponding to the detected pixels to form the compound image CI2 as shown in FIG. 14.
As yet another example, the processing unit 130 may detect the pixels of the same positions on the first Doppler spectrum image to the third Doppler spectrum image, based on the positions (orientation) of the pixels corresponding to each of the first Doppler spectrum image to the third Doppler spectrum image. The processing unit 130 may further perform the image process for multiplying the pixel values corresponding to the detected pixels to form the compound image CI3 as shown in FIG. 15.
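The pixel-wise compounding operations described above can be sketched as follows; the function compound_spectrum_images, the equal blending weights, the clipping to a display range, and the random stand-in images are illustrative assumptions, not the disclosed implementation.

```python
import numpy as np

def compound_spectrum_images(images, mode="add", alpha=None):
    """Combine pixel values at the same positions of two or more Doppler
    spectrum images.  'add', 'subtract' and 'multiply' follow the image
    processes described above; 'blend' is a simple weighted average standing
    in for the various blending processes mentioned."""
    stack = np.stack([np.asarray(im, dtype=float) for im in images])
    if mode == "add":
        out = stack.sum(axis=0)
    elif mode == "subtract":
        out = stack[0] - stack[1:].sum(axis=0)
    elif mode == "multiply":
        out = np.prod(stack, axis=0)
    elif mode == "blend":
        w = np.ones(len(images)) / len(images) if alpha is None else np.asarray(alpha, float)
        out = np.tensordot(w, stack, axes=1)
    else:
        raise ValueError(mode)
    return np.clip(out, 0, 255)   # keep brightness values in an 8-bit display range

# Example: additive compound of three spectrum images (random stand-ins).
imgs = [np.random.default_rng(i).uniform(0, 255, (128, 256)) for i in range(3)]
ci1 = compound_spectrum_images(imgs, mode="add")
```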
Optionally, the processing unit 130 may be configured to perform a mean trace upon the compound image.
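As a hedged sketch, such a mean trace could be computed as the power-weighted mean velocity of each time column of the compound image; the function name, the velocity axis, and the stand-in data are illustrative, and the disclosure does not detail how the trace is calculated.

```python
import numpy as np

def mean_trace(compound_image, velocities):
    """For each time column of the compound spectrum image, compute the
    power-weighted mean velocity (one common definition of a mean trace)."""
    power = np.asarray(compound_image, dtype=float)
    column_power = power.sum(axis=0)
    column_power[column_power == 0] = 1.0          # guard against empty columns
    return (velocities[:, None] * power).sum(axis=0) / column_power

# Example with stand-in data: 128 velocity bins spanning ±1 m/s.
img = np.random.default_rng(4).uniform(0, 255, (128, 256))
trace = mean_trace(img, velocities=np.linspace(-1.0, 1.0, 128))
```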
Referring back to FIG. 1, the ultrasound system 100 may further include the storage unit 140. The storage unit 140 may store the ultrasound data (i.e., brightness mode ultrasound data and Doppler mode ultrasound data) acquired by the ultrasound data acquiring unit 120. The storage unit 140 may further store the Doppler signals formed by the processing unit 130. The storage unit 140 may further store the input information received by the user input unit 110. The storage unit 140 may further store the brightness mode image, the Doppler spectrum images, and the compound image.
The ultrasound system 100 may further include the display unit 150. The display unit 150 may be configured to display the brightness mode image formed by the processing unit 130. The display unit 150 may be further configured to display the Doppler spectrum images formed by the processing unit 130. The display unit 150 may be further configured to display the compound image formed by the processing unit 130.
Although embodiments have been described with reference to a number of illustrative embodiments thereof, it should be understood that numerous other modifications and embodiments can be devised by those skilled in the art that will fall within the spirit and scope of the principles of this disclosure. More particularly, numerous variations and modifications are possible in the component parts and/or arrangements of the subject combination arrangement within the scope of the disclosure, the drawings and the appended claims. In addition to variations and modifications in the component parts and/or arrangements, alternative uses will also be apparent to those skilled in the art.

Claims (19)

What is claimed is:
1. An ultrasound system, comprising: an ultrasound data acquiring unit, including an ultrasound probe, configured to transmit ultrasound signals to a living body in at least one transmission direction and receive ultrasound echo signals from the living body in at least one reception direction to acquire ultrasound data corresponding to a first sample volume and a second sample volume different from the first sample volume;
a processor configured to form a first Doppler spectrum image corresponding to the first sample volume and a second Doppler spectrum image corresponding to the second sample volume based on the ultrasound data corresponding to the first sample volume and the second sample volume, the processor being further configured to perform an image process for forming a compound image upon the first Doppler spectrum image and the second Doppler spectrum image to form the compound image; and
a display configured to display the compound image,
wherein the image process includes at least one of a first image process for adding pixel values corresponding to pixels of same positions on the first Doppler spectrum image and the second Doppler spectrum image, a second image process for subtracting the pixel values corresponding to the pixels of the same positions on the first Doppler spectrum image and the second Doppler spectrum image, a third image process for multiplying the pixel values corresponding to the pixels of the same positions on the first Doppler spectrum image and the second Doppler spectrum image, and a fourth image process for performing a blending process among the pixels of the same position on the first Doppler spectrum image and the second Doppler spectrum image.
2. The ultrasound system of claim 1, wherein the ultrasound signals include unfocused signals or focused signals.
3. The ultrasound system of claim 1, wherein the ultrasound data acquiring unit is configured to:
form reception signals based on the ultrasound echo signals;
perform an analog-digital conversion upon the reception signals to form a plurality of sampling data;
detect pixels corresponding to each of the sampling data from the pixels of the first Doppler spectrum image and the second Doppler spectrum image to cumulatively assign the sampling data to the detected pixels;
perform reception beam-forming upon the sampling data assigned to the detected pixels to form reception-focused data corresponding to the first sample volume and the second sample volume; and
form the ultrasound data corresponding to the first sample volume and the second sample volume based on the reception-focused data.
4. The ultrasound system of claim 3, wherein the ultrasound data acquiring unit is configured to:
set a beam-forming curve for selecting pixels which the respective sampling data are used as pixel data thereof; and
select the pixels corresponding to the beam-forming curve.
5. The ultrasound system of claim 3, wherein the ultrasound data acquiring unit is further configured to:
determine pixels existing in the same column of the first Doppler spectrum image and the second Doppler spectrum image among the selected pixels;
set weights corresponding to the respective determined pixels; and
apply the weights to the sampling data of the respective determined pixels.
6. The ultrasound system of claim 5, wherein the ultrasound data acquiring unit is further configured to:
calculate distances from a center of the respective determined pixels to the beam-forming curve; and
set the weights based on the calculated distances.
7. The ultrasound system of claim 6, wherein the weights are set to be proportional to or inversely proportional to the calculated distances.
8. The ultrasound system of claim 3, wherein the ultrasound data acquiring unit is further configured to:
set a sampling data set for selecting pixels which the respective sampling data are used as pixel data thereof among the sampling data; and
select pixels corresponding to respective sampling data of the sampling data set.
9. The ultrasound system of claim 3, wherein the ultrasound data acquiring unit is further configured to:
perform a down-sampling process upon the reception signals to form down-sampled data.
10. A method of providing a compound image, comprising:
a) forming a first Doppler spectrum image corresponding to a first sample volume and a second Doppler spectrum image corresponding to a second sample volume different from the first sample volume based on ultrasound data corresponding to the first sample volume and the second sample volume; and
b) performing an image process for forming a compound image upon the first Doppler spectrum image and the second Doppler spectrum image to form the compound image, wherein the image process includes at least one of a first image process for adding pixel values corresponding to pixels of same positions on the first Doppler spectrum image and the second Doppler spectrum image, a second image process for subtracting the pixel values corresponding to the pixels of the same positions on the first Doppler spectrum image and the second Doppler spectrum image, a third image process for multiplying the pixel values corresponding to the pixels of the same positions on the first Doppler spectrum image and the second Doppler spectrum image, and a fourth image process for performing a blending process among the pixels of the same position on the first Doppler spectrum image and the second Doppler spectrum image.
11. The method of claim 10, further comprising:
transmitting ultrasound signals to a living body including the target object in at least one transmission direction and receiving ultrasound echo signals from the living body in at least one reception direction to acquire the ultrasound data corresponding to the first sample volume and the second sample volume, prior to performing the step a).
12. The method of claim 11, wherein the ultrasound signals include unfocused signals or focused signals.
13. The method of claim 11, wherein the step of acquiring the ultrasound data, further comprises:
forming reception signals based on the ultrasound echo signals;
performing an analog-digital conversion upon the reception signals to form a plurality of sampling data;
detecting pixels corresponding to each of the sampling data from the pixels of the first Doppler spectrum image and the second Doppler spectrum image to cumulatively assign the sampling data to detected pixels;
performing reception beam-forming upon the sampling data assigned to the detected pixels to form reception-focused data corresponding to the first sample volume and the second sample volume; and
forming the ultrasound data corresponding to each of the first sample volume and the second sample volume based on the reception-focused data.
14. The method of claim 13, wherein the step of detecting the pixels corresponding to each of the sampling data, comprises:
setting a beam-forming curve for selecting pixels which the respective sampling data are used as pixel data thereof; and
selecting the pixels corresponding to the beam-forming curve.
15. The method of claim 14, wherein the step of acquiring the ultrasound data, further comprises:
determining pixels existing in the same column of the first Doppler spectrum image and the second Doppler spectrum image among the selected pixels;
setting weights corresponding to the respective determined pixels; and
applying the weights to the sampling data of the respective determined pixels.
16. The method of claim 15, wherein the step of setting the weights, comprises:
calculating distances from a center of the respective determined pixels to the beam-forming curve; and
setting the weights based on the calculated distances.
17. The method of claim 16, wherein the weights are set to be proportional to or inversely proportional to the calculated distances.
18. The method of claim 14, wherein the step of detecting the pixels corresponding to each of the sampling data, further comprises:
setting a sampling data set for selecting pixels, which the respective sampling data are used as pixel data thereof, among the sampling data; and
selecting pixels corresponding to respective sampling data of the sampling data set.
19. The method of claim 14, wherein the step of detecting the pixels corresponding to each of the sampling data, further comprises:
performing a down-sampling process upon the reception signals to form down-sampled data.
US13/730,033 2011-12-28 2012-12-28 Providing compound image of doppler spectrum images in ultrasound system Active 2034-03-28 US9510803B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2011-0144445 2011-12-28
KR1020110144445A KR101348770B1 (en) 2011-12-28 2011-12-28 Ultrasound system and method for providing compound image of doppler spectrum images

Publications (2)

Publication Number Publication Date
US20130172754A1 US20130172754A1 (en) 2013-07-04
US9510803B2 true US9510803B2 (en) 2016-12-06

Family

ID=48695414

Family Applications (1)

Application Number Title Priority Date Filing Date
US13/730,033 Active 2034-03-28 US9510803B2 (en) 2011-12-28 2012-12-28 Providing compound image of doppler spectrum images in ultrasound system

Country Status (2)

Country Link
US (1) US9510803B2 (en)
KR (1) KR101348770B1 (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9877699B2 (en) 2012-03-26 2018-01-30 Teratech Corporation Tablet ultrasound system
US10667790B2 (en) 2012-03-26 2020-06-02 Teratech Corporation Tablet ultrasound system
WO2015080317A1 (en) * 2013-11-29 2015-06-04 알피니언메디칼시스템 주식회사 Method and apparatus for compounding ultrasonic images
KR102120796B1 (en) * 2014-05-13 2020-06-09 삼성전자주식회사 A beamforming apparatus, a method for forming beams, an ultrasonic imaging apparatus and an ultrasonic probe
WO2018000342A1 (en) * 2016-06-30 2018-01-04 深圳迈瑞生物医疗电子股份有限公司 Method and system for ultrasonic fluid spectral doppler imaging
KR102447020B1 (en) * 2016-09-20 2022-09-26 삼성메디슨 주식회사 Apparatus and method for displaying ultrasound image

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4398540A (en) * 1979-11-05 1983-08-16 Tokyo Shibaura Denki Kabushiki Kaisha Compound mode ultrasound diagnosis apparatus
KR100271469B1 (en) 1997-02-25 2001-01-15 이민화 Digital scan converter of ultrasonic scan system
US20060030776A1 (en) * 2004-08-09 2006-02-09 General Electric Company Range dependent weighting for spatial compound imaging
US20060052704A1 (en) * 2004-09-07 2006-03-09 Tatsuro Baba Ultrasonic doppler diagnostic apparatus and measuring method of diagnostic parameter
KR20060115595A (en) 2005-05-04 2006-11-09 주식회사 메디슨 Method and system for rendering volume data
JP2007160120A (en) 2005-12-16 2007-06-28 Medison Co Ltd Ultrasound system and ultrasound image formation method
US20070167790A1 (en) 2005-12-16 2007-07-19 Medison Co., Ltd. Ultrasound diagnostic system and method for displaying doppler spectrum images of multiple sample volumes
KR100874550B1 (en) 2006-11-24 2008-12-16 주식회사 메디슨 Ultrasound system provides Doppler spectrum of multiple sample volumes
US20090149759A1 (en) * 2007-12-05 2009-06-11 Kabushiki Kaisha Toshiba Ultrasonic imaging apparatus and a method of generating ultrasonic images
KR20100060852A (en) 2008-11-28 2010-06-07 (주)메디슨 Ultrasound method and system for operating spectral doppler compound imaging
US20100260398A1 (en) * 2009-04-14 2010-10-14 Sonosite, Inc. Systems and methods for adaptive volume imaging

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Holly L. Gorton, Biological Action Spectra, http://photobiology.info/Gorton.html, Nov. 1, 2010. *
Korean Notice of Allowance, with English translation, issued in Korean Application No. 10-2011-0144445 on Nov. 28, 2013.
Korean Office Action issued in Korean Application No. 10-2011-0144445, mailed May 15, 2013.
Stoylen, http://folk.ntnu.no/stoylen/strainrate/Ultrasound, Captured on Sep. 29, 2011, but the webpage was updated on Nov. 2010. *

Also Published As

Publication number Publication date
KR101348770B1 (en) 2014-01-07
US20130172754A1 (en) 2013-07-04
KR20130076042A (en) 2013-07-08

Similar Documents

Publication Publication Date Title
US20130172745A1 (en) Providing vector doppler image based on decision data in ultrasound system
US20130172749A1 (en) Providing doppler spectrum images corresponding to at least two sample volumes in ultrasound system
US11406362B2 (en) Providing user interface in ultrasound system
US20130165784A1 (en) Providing motion profile information of target object in ultrasound system
US9510803B2 (en) Providing compound image of doppler spectrum images in ultrasound system
US20130172755A1 (en) Providing turbulent flow information based on vector doppler in ultrasound system
US9261485B2 (en) Providing color doppler image based on qualification curve information in ultrasound system
US9474510B2 (en) Ultrasound and system for forming an ultrasound image
US9232932B2 (en) Providing motion mode image in ultrasound system
US20130172747A1 (en) Estimating motion of particle based on vector doppler in ultrasound system
US20120095342A1 (en) Providing an ultrasound spatial compound image based on center lines of ultrasound images in an ultrasound system
US20130165792A1 (en) Forming vector information based on vector doppler in ultrasound system
US9078590B2 (en) Providing additional information corresponding to change of blood flow with a time in ultrasound system
US20130172744A1 (en) Providing particle flow image in ultrasound system
US20110282205A1 (en) Providing at least one slice image with additional information in an ultrasound system
US20130165793A1 (en) Providing doppler information of target object based on vector doppler in ultrasound system
KR20130075486A Ultrasound system and method for detecting vector information based on transmitting delay

Legal Events

Date Code Title Description
AS Assignment

Owner name: SAMSUNG MEDISON CO., LTD., KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:KIM, HYOUNG JIN;REEL/FRAME:029542/0098

Effective date: 20121109

STCF Information on status: patent grant

Free format text: PATENTED CASE

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4