WO2001029772A1 - Perspective with shear warp - Google Patents

Perspective with shear warp

Info

Publication number
WO2001029772A1
Authority
WO
WIPO (PCT)
Prior art keywords
viewing
image
data
volume data
major axis
Prior art date
Application number
PCT/CA2000/001184
Other languages
French (fr)
Inventor
Vittorio Accomazzi
Original Assignee
Cedara Software Corp.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Cedara Software Corp. filed Critical Cedara Software Corp.
Priority to AU77659/00A priority Critical patent/AU7765900A/en
Publication of WO2001029772A1 publication Critical patent/WO2001029772A1/en
Priority to US10/122,148 priority patent/US20030012419A1/en
Priority to US10/792,126 priority patent/US7031505B2/en


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/10 Geometric effects
    • G06T15/20 Perspective computation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/08 Volume rendering

Definitions

  • in some cases the viewing angle may require tracing rays parallel to all three axes. This results in at least one ray with a zero coordinate for each axis, and therefore it is not possible to use the same factorization for all of them. The limit this imposes is that the viewing angle must be less than a 90° solid angle.
  • the sample step size r is calculated in the same manner as it was in the previous embodiment. Although resampling the image improves the performance of the algorithm, there is still an overhead associated with it. Therefore, if the amount of resampling to be done is minimized, a further performance enhancement can be realized. As can be seen in figure 7, some rays have the y-axis as the major axis while others have the x-axis as the major axis. Those rays that already have the y-axis as the major axis are not resampled. Therefore, if the resampling occurs only in the region that originally has the x-axis as the major axis, then fewer resources and less time will be used to perform the shear.
  • one aspect of this method is that the addition of a resampling step does not change the warp matrix.
  • the same warp matrix can be used to transform the baseline image to the final image.
  • Mview and M'view are two matrices with different resampling steps, where the resampling occurs in the x direction:
  • Mview = Mwarp Mshear and M'view = M'warp M'shear
  • the warp matrix is really a 2-D matrix since it is applied to the baseline image in shear space, in which the slices have been composed.
  • the first row of this matrix can be compressed because the x-axis is used in the factorization:
  • M'warp2D = Mwarp2D. This property allows the use of different sample step sizes in shear space along the major axis.
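The warp-invariance property stated above can be checked numerically. The sketch below uses illustrative 2×2 matrices (the concrete values are ours, not the patent's): resampling along the major axis is absorbed entirely by the shear factor, so both factorizations recover the same warp matrix.

```python
# Sketch (illustrative values, not from the patent): resampling along the
# major axis changes the shear factor but leaves the warp factor unchanged.

def mat_mul(A, B):
    """Multiply two 2x2 matrices stored as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inv2(A):
    """Invert a 2x2 matrix."""
    (a, b), (c, d) = A
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

M_shear = [[1.0, 0.2], [0.0, 1.0]]   # shear for the original volume
M_warp  = [[0.9, 0.1], [0.0, 1.1]]   # illustrative warp factor
M_view  = mat_mul(M_warp, M_shear)   # Mview = Mwarp . Mshear

# Resample along the major (y) axis with step r: only the shear absorbs it.
r = 6.0
R = [[1.0, 0.0], [0.0, 1.0 / r]]
M_shear2 = mat_mul(M_shear, R)       # shear for the resampled volume

# Both factorizations recover the SAME warp matrix.
W1 = mat_mul(M_view, inv2(M_shear))                 # warp from original volume
W2 = mat_mul(mat_mul(M_view, R), inv2(M_shear2))    # warp from resampled volume
for i in range(2):
    for j in range(2):
        assert abs(W1[i][j] - W2[i][j]) < 1e-12
        assert abs(W1[i][j] - M_warp[i][j]) < 1e-12
```

This is why different sample step sizes can be used along the major axis without recomputing the 2-D warp.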

Abstract

The present invention relates to a method for generating a 2-D projection directly from 3-D volume data, the method comprising the steps of: determining a viewing direction vector in a viewing frustum; determining a major axis of the direction vector; resampling the volume data in the direction of the major axis; applying a shear factorization to the resampled data; and rendering the factorized data. The method provides a singularly warped image, which avoids having to patch together images from multiple warp functions and, in turn, improves the quality of the final image. Finally, the method allows a scene to be rendered from within the scene itself. The invention can be applied to medical imaging and enables a surgeon to view an image such as a CT scan with perspective, as well as from within the scan itself, providing the surgeon with an invaluable tool.

Description

PERSPECTIVE WITH SHEAR WARP
The present invention relates to the field of three-dimensional (3-D) image rendering, and more particularly to fast volume rendering using factorization.
BACKGROUND OF THE INVENTION
Real-time rendering of 3-D images in volume visualization applications has become increasingly important. This is particularly useful in clinical applications for the display and analysis of volumetric data sets acquired by imaging methods such as Computed Tomography (CT), Magnetic Resonance Imaging (MRI) or Ultrasonic imaging. Benefits of volume visualization include the ability to obtain oblique views for the increased understanding of complex geometric structures within organs, and the ability to measure volumes, areas, and distances. Volume visualization also provides the ability to explore the spatial relationship between an organ and its surrounding structures or tissues. In general, generation of 3-D images includes the steps of data acquisition, volume reconstruction, and image visualization. Volume rendering is a technique that generates a two-dimensional (2-D) projection directly from the 3-D volume data without requiring any intermediate geometrical data structure.
Unfortunately, the application of volume rendering to medical imaging and scientific visualization is limited because it is computationally expensive to generate acceptable frame rates. In order for rendering to be effective it is important that it is interactive, that is a user can make certain requests of the image and expect a real-time response. High quality images would take tens of seconds or even minutes to generate using typical workstation technology. Various approaches have been tried in an attempt to improve the response time. One approach is to trade quality for speed. This approach is ineffective because many perceptual components of the image are obscured from the view of the user. A more costly approach is to rely on special-purpose hardware, which drastically reduces the flexibility of the system. Another approach relies on brute force and simply attempts to render an image by utilizing large multiprocessor computers to perform the rendering in parallel. These types of computers, however, are very expensive and require a large number of processors to achieve acceptable frame rates.
A further approach is to provide better, faster volume rendering algorithms. Several existing volume-rendering algorithms operate by sampling the 2-D slices of the 3-D volume data (also known as factorization) at a fixed resolution; shearing the sampled slices to form an intermediate but distorted image; and warping the intermediate image in 2D to form an undistorted final image. An example of such a technique is described in United States Patent No. 5,787,889, titled "Ultrasound imaging with real-time 3D image reconstruction and visualization".
The image visualization process derives 2D image projections of the 3D image. The stack of 2D images is known as the "stack space". A shear warp factorization process is used to derive the new 2D projection for one or more video frames of the image. For each change in viewing angle, the processor factorizes the necessary viewing transformation matrix into a 3D shear, which is parallel to slices of the volume data. A projection of the shear forms a 2D intermediate image. A 2D warp is implemented to produce the final image, that is, a 2D projection of the 3D volume at a desired viewing angle.
During the process of re-visualizing the volume at a different viewing angle, the reconstructed volume is sheared by transforming the reference coordinate system to an intermediate coordinate system. This simplifies mapping of the data. The intermediate coordinate system also is referred to as "sheared object space". The sheared object space is selected so that all viewing rays are parallel to one of the axes of the original coordinate system for the volume (e.g., the reference coordinate system). Figure 1(a) depicts the transformation into sheared object space for parallel projection. The volume is retrieved as a set 16 of volume slices. The shearing direction for the volume is parallel to the set 16 of slices. Alternatively, the set of slices is resampled to be parallel to the shearing direction. The slices 16 then are translated and resampled to achieve image projection rays 19 which are perpendicular to the sheared slices 18. The intermediate image projection plane is parallel to the slices of the volume. Since all the projection rays 19 are perpendicular to both the projection plane 14 and the slices 18, an imaging process for the projection causes the image data to be accessed in storage order. Because the shearing occurs only on two axes, a simple translation operation is used which does not require much computation. The result of the shear factorization is a distorted image projection. Such a distorted projection is not displayed. Before the volume is displayed, the projection undergoes a 2D geometric image warping operation to create a final 2D image projection of the volume.
Figure 1(b) illustrates perspective projection and is represented by the numeral 20. For a perspective transformation, the slices 16 are scaled in addition to being sheared to achieve sheared object space slices 24. The scaling produces an effect that allows an observer to perceive the volume with perspective. Such a method is best described in a paper titled "Fast Volume Rendering Using a Shear-Warp Factorization of the Viewing Transformation", Technical Report CSL-TR-95-678, Departments of Electrical Engineering and Computer Science, Stanford University, September 1995.
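The two transforms described above can be sketched in code (a minimal illustration under assumed parameters; the shear coefficients sx, sy and the eye distance ez are hypothetical, not taken from the patent): for parallel projection each slice is only translated, while for perspective projection each slice is also scaled, shrinking with distance from the eye.

```python
# Sketch (not from the patent): per-slice transform in sheared object space.

def shear_slice_offset(k, sx, sy):
    """Translation of slice k for a parallel shear with per-slice shear (sx, sy)."""
    return (k * sx, k * sy)

def perspective_slice_transform(k, sx, sy, ez):
    """Translation and scale of slice k for a perspective shear.
    ez is a hypothetical eye distance from slice 0 along the major axis."""
    scale = ez / (ez + k)                    # slices farther from the eye shrink
    return (k * sx * scale, k * sy * scale), scale

# Parallel: slice 4 with shear (0.2, 0.1) is shifted by (0.8, 0.4)
assert shear_slice_offset(4, 0.2, 0.1) == (0.8, 0.4)

# Perspective: the same slice is also scaled down
offset, scale = perspective_slice_transform(4, 0.2, 0.1, ez=16.0)
assert scale == 0.8
```

The extra per-slice scale is exactly what lets the observer perceive the volume with perspective, at the cost of a resampling step the parallel case avoids.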
However, the current implementations of shear warp factorization are limited in several aspects. First of all, the major axis is not unique for all the rays in the viewing frustum. As previously mentioned, the major axis is used for factorization. Therefore, since there are three coordinate axes, there are three possible major axes, and a scene may be rendered in as many as three different factorizations. Each rendering process produces an image patch, and joining, or stitching, the patches together creates the final scene.
It has been found that the stitching process is not computationally trivial and, furthermore, some distortion is created along the border of the patches. The distortion occurs because the factorizations are only mathematical models and, therefore, the results may differ slightly between factorizations. Additionally, using two or more factorizations is expensive in terms of memory access and computation.
Furthermore, current implementations also do not allow the observer to view the image from a position within the data set, a position which can provide valuable information to the observer.
It is an object of the present invention to obviate or mitigate at least some of the above mentioned disadvantages.
SUMMARY OF THE INVENTION
An advantage of the present invention is a method for generating a 2-D image with perspective using shear-warp factorization that uses one copy of the volume data.
A further advantage of the invention is that the volume data may be accessed in storage order, which has the further advantage of accessing memory once for every slice. In accordance with this invention there is provided a method for generating a 2-D projection directly from 3-D volume data, the method comprising the steps of:
(a) determining a viewing direction vector in a viewing frustum;
(b) determining a major axis of the direction vector;
(c) resampling the volume data in the direction of the major axis;
(d) applying a shear factorization to the resampled data; and
(e) rendering the factorized data.
According to a further aspect of the present invention there is provided a system for generating in substantially real time a 2-D image projection directly from 3-D volume data in response to input from a user of the system. The system comprises: a) a memory for storing the volume data; b) a processor for factoring and rendering image data selected from the volume data; c) a user interface for providing the processor with an image parameter, the image parameter to be used in generating the image projection; d) a display for displaying the image projection provided by the processor; and e) a refined grid used by the processor for sampling the image data, wherein the image parameter facilitates determination of a resolution of the refined grid. In accordance with a further embodiment there is provided a system for generating a 2-D image projection directly from 3-D volume data, the system comprising a microprocessor programmed for determining a viewing direction vector in a viewing frustum, determining a major axis of said viewing direction vector, re-sampling the volume data in a direction of said major axis, applying a shear factorization to the re-sampled data, and rendering the factorized data for producing the image projection.
BRIEF DESCRIPTION OF THE DRAWINGS
These and other embodiments of the invention will now be described by way of example only, with reference to the accompanying drawings, in which:
Figure 1(a) is an illustration of shear warp factorization with parallel projection according to the prior art;
Figure 1(b) is an illustration of shear warp factorization with perspective projection according to the prior art;
Figure 2 is a schematic diagram of a volume rendering system for implementing an embodiment of the present invention;
Figures 3(a) and (b) are schematic diagrams showing the coordinate systems used in an embodiment of the present invention;
Figures 4(a) and (b) are schematic diagrams of a cross-section of a scene to be rendered and its associated viewing frustum;
Figure 5 is a flow chart illustrating the steps involved in the shear warp process according to an embodiment of the invention;
Figure 6 is another schematic diagram of a cross-section of a scene to be rendered;
Figure 7 is a schematic diagram of the image in figure 6 after shearing;
Figure 8 is a schematic graph illustrating the mapping of two related points to the same result;
Figure 9 is a schematic diagram of the cross-section of the scene in figure 4(a) after resampling;
Figure 10 is a schematic diagram of the cross-section of the scene in figure 9 after shearing; and
Figure 11 is a schematic graph illustrating the use of different shear matrices and the same warp matrix for factorization.
DESCRIPTION OF THE PREFERRED EMBODIMENT
In the following description, like numerals refer to like structures in the drawings. Referring to figure 2, a block diagram of an imaging system for implementing an embodiment of the present invention is shown generally by numeral 100. The system 100 includes a memory 102 for storing a dataset 104 comprising a slice stack of the volume data, and a processing system 106 responsive to a user input 108 for rendering and displaying on a display 110 a view 112 of the dataset 104.
To reconstruct an image on the display in substantially real time, in response to a user selected view of the dataset, the processing system performs a transformation by shear factorization to create a distorted image projection. Such a distorted projection is not displayed. Before the volume is displayed, the projection undergoes a 2D geometric image warping operation to create a final 2D image projection of the volume. Thus, given a scene defined by a volume, a viewing position, a viewing direction and a viewing frustum, the method selects an axis to use in the factorization by ensuring that it is the one with the biggest component in the viewing direction, or major axis. This is achieved by resampling the volume slices with a finer grid along this major axis and then rendering the image as above.
This may be illustrated geometrically by referring to Figures 3(a) and 3(b), which also show the various coordinate systems used in an imaging system according to an embodiment of the invention. The relationship between the various coordinate systems is well known in the art, but is briefly described herein for convenience. The coordinate systems include a viewer 120 at a viewpoint V with a coordinate system defined by orthogonal axes (u,v,w), an image plane P 122 at a distance n along a viewing direction 124, and a far plane 126 at a distance f in the volume 104. The volume 104 is comprised of a series of slices (obtained through a CT scan, MRI, Ultrasonic or other techniques). Another coordinate system having orthogonal axes (x,y,z) is defined with its origin at one corner of the volume 104, and the plane parallel to the slices and extending in the z direction as shown in figure 3(a). A viewing frustum 128 is shown having an angle α to the viewing direction 124. The entire system is referenced with respect to a world coordinate system 130. Thus, according to an embodiment of the present invention, referring to Figures 3(a) and (b), data indicating the viewpoint V in the 3D workspace includes coordinate data indicating the viewpoint's position. Data indicating the viewpoint in a 3D workspace may also include data indicating the viewpoint's "direction of orientation" in the workspace. The direction of orientation is the direction from the viewpoint into the field of view along the axis at the center of the field of view. Each viewpoint into a 3D workspace provides a view into the workspace that is delineated by a truncated pyramid structure called a viewing frustum, as shown in figure 3(a). As may be seen, specifying two items specifies the viewing frustum: a position of the user's eye and a position of a point in the workspace to be centered in the view. A viewing transform automatically produces an axis defined by these two items, which is called a "line of sight". The "line of sight" is a ray cast from the user's eye through the center of the viewing frustum and produces an axis orthogonal to the image surface of the display.
As may be seen in figure 3(b), the image plane 122 is shown as comprised of rows and columns of pixels (or voxels) with the point P at the center of the image plane. Referring to figure 4(a), a cross-section 142 of the volume 126 as shown in figure 4(b) and a viewing direction projected onto the x-y plane are shown generally by numeral 140. The viewing direction is indicated by a vector D. As described earlier, shear-warp factorization is achieved by transforming a volume into sheared object space wherein for parallel projection each slice is translated, while for perspective projection each slice is translated and scaled. In both instances, in sheared object space the voxel slices can be projected into an image easily. By definition, in sheared object space all viewing rays are parallel to the third coordinate axis, also known as the major axis (the y axis as illustrated in figure 4(a)). Thus if there exists a viewing transformation matrix Mview that transforms points (represented as column vectors) from object space to image space, then Mview may be used to determine the viewing direction D.
The viewing direction D is determined from the viewing transformation matrix Mview (the defining equation appears as an image in the original document).
In the present invention the major axis is selected by using the axis that has the largest component in the viewing direction. For example, the vector D in figure 4(a) has the coordinates x = 2, y = 1. Since the x component of D is greater than the y component, the x-axis is selected as the major axis.
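The axis-selection rule above can be sketched as a short helper (an illustrative function of ours, not from the patent): the major axis is simply the coordinate axis carrying the largest absolute component of D.

```python
# Pick the major axis: the coordinate axis with the largest absolute
# component of the viewing direction vector D.

def major_axis(direction):
    """Return the index (0 = x, 1 = y, 2 = z) of the major axis."""
    return max(range(len(direction)), key=lambda i: abs(direction[i]))

# Figure 4(a): D = (2, 1) -> x is the major axis
assert major_axis((2, 1)) == 0
# Figure 6: D = (-0.2, 1) -> y is the major axis
assert major_axis((-0.2, 1)) == 1
```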
Once the major axis is selected, a sample step size is calculated. The volume is resampled with a higher resolution along the major axis. The step size indicates the extent of the resampling. For example, a step size of six implies that the major axis will be sampled with a resolution increased by a factor of six.
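As a sketch of this resampling step (the linear interpolation filter is our assumption; the patent does not specify one), a step size r means r samples are taken per original voxel spacing along the major axis:

```python
# Resample a 1-D profile along the major axis with a finer grid.
# A step size of r inserts r samples per original voxel spacing
# (linear interpolation is an assumption, not specified by the patent).

def resample(samples, r):
    """Linearly interpolate so that each original interval holds r steps."""
    out = []
    for i in range(len(samples) - 1):
        a, b = samples[i], samples[i + 1]
        for k in range(r):
            out.append(a + (b - a) * k / r)
    out.append(samples[-1])
    return out

profile = [0.0, 6.0]
fine = resample(profile, 6)          # step size of six -> 6x resolution
assert fine == [0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
```

In the full 3-D setting this would be applied along the major axis of every volume slice; the 1-D profile keeps the example self-contained.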
A shear warp transformation is applied to the volume in stack space, that is, the stack of sliced 2-D images. The shear warp transformation transforms the image to one that can be displayed. The matrices used in the transformation, Mshear and Mwarp, are defined in accordance with the paper "Fast Volume Rendering Using a Shear-Warp Factorization of the Viewing Transformation" by Philippe Lacroute and Marc Levoy, and the major direction is determined as described above. Thus, referring to figure 5, a flow chart illustrating the steps involved in the shear warp process according to an embodiment of the invention is shown generally by numeral 150.
In some instances all the viewing rays in the viewing frustum are not guaranteed to have the same major axis, and therefore it may not be possible to use only one factorization.
Referring to figure 6, a 2D example is illustrated in which all the viewing rays do not have the same major axis. Once again D represents the viewing direction vector, vectors r0 and r1 represent two rays in the viewing frustum other than D, and Y represents the volume in stack space to be rendered. In this particular example, D = (-0.2, 1), r0 = (1, 0.2), and r1 = (-1, 0.2). Therefore, the largest component of the vector D is in the y direction and the y-axis is selected as the major axis.
However, the situation appears quite different in object space. From Lacroute et al. above, the general equation for a 2D shear matrix Mshear is given by

    Mshear = | 1  -s |
             | 0   1 |

where s is the ratio of the component of the vector in the viewing direction that is not along the major axis to the component that is along the major axis. Therefore s = -0.2 and the shear matrix is

    Mshear = | 1  0.2 |
             | 0   1  |
The vectors in shear space are:
D' = Mshear D = (0, 1); r'0 = Mshear r0 = (1.04, 0.2); r'1 = Mshear r1 = (-0.96, 0.2)
The sheared image is shown in figure 7. It can be seen that r'0 cannot be calculated using the above factorization because y is not the major axis for this ray.
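The worked example above can be reproduced numerically (a NumPy sketch, outside the patent text); the shear annihilates the x component of D but leaves r'0 with x as its largest component, which is what breaks the single factorization:

```python
import numpy as np

# Major axis y, s = -0.2, so Mshear = [[1, -s], [0, 1]] = [[1, 0.2], [0, 1]].
M_shear = np.array([[1.0, 0.2],
                    [0.0, 1.0]])

D  = np.array([-0.2, 1.0])
r0 = np.array([ 1.0, 0.2])
r1 = np.array([-1.0, 0.2])

print(M_shear @ D)    # (0, 1): major axis y
print(M_shear @ r0)   # approx. (1.04, 0.2): major axis x for this ray
print(M_shear @ r1)   # approx. (-0.96, 0.2)
```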
However, if the volume is resampled with a higher resolution along the major axis it is guaranteed that, under certain conditions, all the rays in the viewing frustum will have the same major axis. Therefore it is desirable to resample the image along the y-axis with resampling step size r. The resampled volume Y' is related to the original volume Y by:
    Y' = R Y,   where   R = | 1  0  0  0 |
                            | 0  r  0  0 |
                            | 0  0  1  0 |
                            | 0  0  0  1 |
As shown schematically in figure 8, Y is transformed to the final image O using the matrix Mview. Therefore it is necessary to find the matrix M'view to transform Y' to the same O. It is known that:

    O = M'view Y'   (1)

and

    O = Mview Y = Mview R^-1 R Y = Mview R^-1 Y'   (2)

By comparing equation 1 with equation 2 it can be determined that:

    M'view = Mview R^-1 =

    | m11 m12 m13 m14 |   | 1   0   0  0 |   | m11 m12/r m13 m14 |
    | m21 m22 m23 m24 |   | 0  1/r  0  0 |   | m21 m22/r m23 m24 |
    | m31 m32 m33 m34 | . | 0   0   1  0 | = | m31 m32/r m33 m34 |
    | m41 m42 m43 m44 |   | 0   0   0  1 |   | m41 m42/r m43 m44 |
It is necessary to calculate the viewer position in stack space. Since M'view^-1 = R Mview^-1, the viewer position e'0 in the resampled volume is

    e'0 = R e0

so that the component along the resampled axis is scaled:

    e'0,y = r e0,y

Following the same rule, e'0,x, e'0,z and e'0,w are unchanged:

    e'0,x = e0,x,   e'0,z = e0,z,   e'0,w = e0,w
If the viewer is located outside the volume, vectors are constructed from the viewer position to each one of the corners of the volume. These vectors represent the boundary of the viewing frustum and all the rays are included in it. If points p'i, for i = 0...7, are the corners of a volume in Y', then the major direction for each vector is defined as:
    f = max( |p'i,x - e'0,x|, |p'i,y - e'0,y|, |p'i,z - e'0,z| )   (3)

From equation 3 it can be determined that as long as

    |p'i,y - e'0,y| ≠ 0   for all i   (4)
there exists an integer r that, when used as the sample step size, ensures all rays in the viewing frustum have the same major axis. Since all the rays in the frustum have the same major axis, they are all rendered with the same factorization. Therefore, the image displayed does not have the visual defects that may arise if the image is rendered with multiple factorizations and then patched together. The image is also rendered faster than if multiple factorizations are used.
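One way to find such an integer r can be sketched as follows (a hypothetical helper, not from the patent): form the corner vectors from the eye, and pick the smallest integer r whose scaling of the major component makes that component strictly dominate all the others. Applied to the two rays of figure 6, it recovers the step size of 6 used in the next example:

```python
import numpy as np

def find_step_size(corners, eye, major=1):
    """Smallest integer r such that, after scaling the `major`
    coordinate by r, that axis strictly dominates every vector
    from the eye to a volume corner.  Requires the condition in
    the text: no corner vector has a zero `major` component."""
    vecs = np.asarray(corners, dtype=float) - np.asarray(eye, dtype=float)
    along = np.abs(vecs[:, major])
    if np.any(along == 0):
        raise ValueError("a corner vector is perpendicular to the major axis")
    others = np.abs(np.delete(vecs, major, axis=1)).max(axis=1)
    # Need r * along_i > others_i for every corner vector i.
    return int(np.floor((others / along).max())) + 1

# The rays of figure 6, treated as corner vectors from the eye at the origin:
print(find_step_size([(1.0, 0.2), (-1.0, 0.2)], (0.0, 0.0)))  # 6
```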
Referring once again to the 2D scene in figure 6, it is possible to resample the scene along the y-axis. If it is determined that the value for r is 6, then the scene is resampled with a step size of 6 and the results are shown in figure 9. The new values for the vectors are: D = (-0.2, 6); r0 = (1, 1.2); and r1 = (-1, 1.2). Here s = -0.2/6 and the shear matrix is:

    Mshear = | 1  0.2/6 |
             | 0    1   |

The vectors in shear space are:

    D' = Mshear D = (0, 6); r'0 = Mshear r0 = (1.04, 1.2); r'1 = Mshear r1 = (-0.96, 1.2)
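Repeating the numerical check for the resampled scene (again a NumPy sketch, not part of the patent text):

```python
import numpy as np

# After resampling y with step size r = 6: s = -0.2/6,
# so Mshear = [[1, 0.2/6], [0, 1]].
M_shear = np.array([[1.0, 0.2 / 6.0],
                    [0.0, 1.0]])

D  = np.array([-0.2, 6.0])
r0 = np.array([ 1.0, 1.2])
r1 = np.array([-1.0, 1.2])

for v in (D, r0, r1):
    print(M_shear @ v)
# Approximately (0, 6), (1.04, 1.2) and (-0.96, 1.2): the y-axis is
# now the major axis for every vector, so one factorization suffices.
```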
The sheared image is shown in figure 10. At this point, it is possible to see that in object space the y-axis is the major axis for all the vectors, and therefore only one factorization will be necessary.

If the position of the viewer is located within the volume, a different procedure is used to determine the value for r. In this case it is necessary to form a ray to each of the voxels in the viewing frustum. If v'i is a voxel in the viewing frustum, then the vector from the observer to this voxel is v'i - e'0. The major direction is determined by:

    f = max( |v'i,x - e'0,x|, |v'i,y - e'0,y|, |v'i,z - e'0,z| )   (5)

In this situation, however, there is no guarantee that there is a value for r such that

    |v'i,y - e'0,y| ≠ 0   for all i   (6)

In fact, if the viewing angle is wide enough, it may be required to trace rays parallel to all three axes. This results in at least one ray with a zero co-ordinate for each axis, and therefore it is not possible to use the same factorization. Therefore, the limit that this imposes is that the viewing angle must be less than a 90° solid angle.
It is important to note that although the case for resampling along the y-axis has been explained, it can be shown in a very similar manner for all the other axes.
In the case in which the observer is outside the stack, one axis that satisfies equation (4) can always be found. In a second embodiment, it is possible to improve the performance of the algorithm.
The sample step size r is calculated in the same manner as in the previous embodiment. Although resampling the image improves the performance of the algorithm, there is still an overhead associated with it. Therefore, if the amount of resampling to be done is minimized, a further performance enhancement can be realized. As can be seen in figure 7, some rays have the y-axis as the major axis while others have the x-axis as the major axis. Those rays that already have the y-axis as the major axis are not resampled. Therefore, if the resampling occurs only in the region that originally has the x-axis as the major axis, then not as many resources and not as much time will be used to perform the shear.
One aspect of this method is the fact that the addition of a resampling step does not change the warp matrix. As is illustrated schematically in figure 11, although two different shear matrices are used to transform the volume to a baseline image, the same warp matrix can be used to transform the baseline image to the final image.
If Mview and M'view are two matrices with different resampling steps and the resampling occurs in the x direction:
    Mview = Mwarp Mshear

where, for the x major axis, with shear coefficients sy, sz and translation p,

    Mshear = |  1   0  0  0 |
             | -sy  1  0  0 |
             | -sz  0  1  0 |
             |  p   0  0  1 |

and

    M'view = Mwarp Mshear R^-1

           = | w11 w12 w13 w14 |   |  1/r   0  0  0 |
             | w21 w22 w23 w24 |   | -sy/r  1  0  0 |
             | w31 w32 w33 w34 | . | -sz/r  0  1  0 |
             | w41 w42 w43 w44 |   |  p/r   0  0  1 |

           = Mwarp M'shear

where the wij are the entries of Mwarp and M'shear is Mshear with its first column divided by r.
The warp matrix is really a 2-D matrix, since it is applied to the baseline image in shear space in which the slices have been composited. The first row of this matrix can be suppressed because the x-axis is used in the factorization:

    Mwarp2D = | w22 w23 w24 |
              | w32 w33 w34 |
              | w42 w43 w44 |

and M'warp2D = Mwarp2D. This property allows the use of different sample step sizes in shear space along the major axis.
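This invariance can be spot-checked numerically. The shear coefficients sy, sz, p and the step size below are made-up illustrative values; right-multiplying a shear matrix of the form used in this embodiment by R^-1 = diag(1/r, 1, 1, 1) rescales only its first column, so the warp factor, and hence the 2-D warp obtained from it, is untouched:

```python
import numpy as np

r = 6.0
sy, sz, p = 0.3, -0.4, 0.1   # illustrative shear coefficients (made up)

M_shear = np.array([[1.0, 0.0, 0.0, 0.0],
                    [-sy, 1.0, 0.0, 0.0],
                    [-sz, 0.0, 1.0, 0.0],
                    [  p, 0.0, 0.0, 1.0]])
R_inv = np.diag([1.0 / r, 1.0, 1.0, 1.0])

M_shear_prime = M_shear @ R_inv

# Only the first column is scaled by 1/r; columns 2-4 are unchanged,
# so the same Mwarp (and the same 2-D warp) applies to both factorizations.
print(np.array_equal(M_shear_prime[:, 1:], M_shear[:, 1:]))   # True
```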
Since the warp matrix to be used is the same regardless of the value of r, and all the major axes are the same, there is still only one factorization and the image will not have to be patched together. Also, as previously mentioned, the speed of the algorithm is a further improvement over the previous embodiment.

As in the previous embodiment, it is important to note that although the case for resampling along the x-axis has been explained, it can be shown in a very similar manner for all the other axes.

Also, the examples given for all embodiments have related to 2D scenes for simplicity only. In reality, the scenes that are rendered are 3D, and the methods described herein are simply extended to the 3D application.

Although the invention has been described with reference to certain specific embodiments, various modifications thereof will be apparent to those skilled in the art without departing from the spirit and scope of the invention as outlined in the claims appended hereto.

Claims

CLAIMS:
1. A method for generating a 2-D image projection directly from a 3-D volume data, the method comprising the steps of:
a) determining the viewing direction vector in a viewing frustum; b) determining a major axis of said viewing direction vector; c) re-sampling the volume data with a refined grid in a direction of said major axis; d) applying a shear factorization to the re-sampled data; and e) rendering the factorized data for producing said 2-D image projection.
2. A method according to claim 1 further comprising the step of determining said viewing frustum by specifying positional data of a viewing point and a central point, said central point to be substantially centered in said 2-D projection.
3. The method according to claim 2, wherein said positional data of said viewing point further includes direction of orientation data of said viewing point in said volume data.
4. The method according to claim 2, wherein said viewing frustum is a truncated pyramidal geometrical structure.
5. The method according to claim 4 further comprising the step of determining boundaries of said viewing frustum by constructing boundary vectors, each of the boundary vectors including said viewing point and extending to each respective corner of said volume data, wherein said viewing point is positioned external to the positions contained by said volume data.
6. The method according to claim 5 further comprising the step of employing geometrical information obtained from said boundary vectors for determining a step size used to select a resolution of said refined grid.
7. The method according to claim 1 further comprising the step of using the major component of said viewing direction vector for determining said major axis.

8. The method according to claim 7 further comprising the step of selecting a step size for determining a resolution of said refined grid.

9. The method according to claim 8, wherein said step size is used for providing a plurality of viewing vectors in said viewing frustum having the same said major axis in an object viewing space as said viewing direction vector.

10. The method according to claim 9, wherein said viewing direction vector contains a viewing point and a central point, said central point is substantially centered in said 2-D image projection.

11. The method according to claim 8, wherein a plurality of viewing vectors in said viewing frustum are rendered with the same factorization matrix.

12. The method according to claim 9 further comprising the step of using one copy of said volume data for applying said shear factorization.

13. The method according to claim 12 further comprising the step of accessing said volume data in a pre-defined storage order.

14. The method according to claim 13, wherein said volume data is stored in a memory in a stack of 2-D image slices.

15. The method according to claim 14 further comprising the step of accessing said memory once for every selected one of the image slices in said stack.

16. The method according to claim 2 further comprising the step of constructing a series of separate viewing vectors containing said viewing point, each said separate viewing vector extending to a respective one of a plurality of voxels contained in said viewing frustum, wherein a position of said viewing point is located within said volume data.
17. The method according to claim 16 further comprising the step of selecting a step size for determining a resolution of said refined grid, wherein a viewing angle contained by said viewing frustum and said viewing direction vector is less than 90°.
18. The method according to claim 1 further comprising the step of restricting the resampling step to selected ones of a plurality of viewing vectors in said viewing frustum, said selected ones having a preliminary major axis different from said major axis of said viewing direction vector.
19. A method according to claims 1 through 18, wherein the rendering of said factorized data produces a 3-D image.
20. A system for generating an 2-D image projection directly from a 3-D volume data, the system comprising a microprocessor programmed for: determining a viewing direction vector in a viewing frustum; determining a major axis of said viewing direction vector; re-sampling the volume data in a direction of said major axis; applying a shear factorization to the re-sampled data; and rendering the factorized data for producing the image projection.
21. A system for generating in substantially realtime fashion in response to input from a user a 2-D image projection directly from a 3-D volume data, the system comprising:
a) a memory for storing the volume data; b) a processor for factorizing and rendering an image data set selected from said volume data; c) a user interface for providing said processor with an image parameter, said image parameter to be used in generating the image projection; d) a display for displaying said image projection provided by said processor; and e) a refined grid used by said processor for re-sampling said image data set, wherein said image parameter facilitates determination of a resolution of said refined grid.

22. A system according to claim 21, wherein said image parameter includes a viewing point and a central point, said central point is substantially centered in said 2-D image projection.
23. The system according to claim 22, wherein said viewing point and said central point are used for determining a viewing direction for said 2-D image projection.

24. The system according to claim 23, wherein the major component of said viewing direction vector is used for determining a major axis in a sheared object space, said refined grid is applied along said major axis.

25. The system according to claim 21, wherein said volume data is stored in said memory as a stack of 2-D image slices.

26. The system according to claim 25, wherein said processor accesses said memory once for every selected one of said 2-D image slices.

27. The system according to claim 26, wherein said processor accesses said memory in a predefined storage order.

28. The system according to claim 21, wherein said image parameter is used for determining a resolution step size of said refined grid.

29. The system according to claim 25, wherein said stack of 2-D image slices is obtained from an imaging system selected from the group comprising CT, MRI, and Ultrasound.

30. The system according to claims 21 to 29, wherein the processing of said image dataset produces a 3-D image.
PCT/CA2000/001184 1999-10-15 2000-10-13 Perspective with shear warp WO2001029772A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
AU77659/00A AU7765900A (en) 1999-10-15 2000-10-13 Perspective with shear warp
US10/122,148 US20030012419A1 (en) 1999-10-15 2002-04-15 Perspective with shear warp
US10/792,126 US7031505B2 (en) 1999-10-15 2004-03-04 Perspective with shear warp

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CA2,286,447 1999-10-15
CA002286447A CA2286447C (en) 1999-10-15 1999-10-15 Perspective with shear warp

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US10/122,148 Continuation US20030012419A1 (en) 1999-10-15 2002-04-15 Perspective with shear warp

Publications (1)

Publication Number Publication Date
WO2001029772A1 true WO2001029772A1 (en) 2001-04-26

Family

ID=4164391

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CA2000/001184 WO2001029772A1 (en) 1999-10-15 2000-10-13 Perspective with shear warp

Country Status (4)

Country Link
US (2) US20030012419A1 (en)
AU (1) AU7765900A (en)
CA (1) CA2286447C (en)
WO (1) WO2001029772A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1290898A1 (en) 2000-05-03 2003-03-12 Koninklijke Philips Electronics N.V. Autostereoscopic display driver
JPWO2012063653A1 (en) * 2010-11-12 2014-05-12 株式会社日立メディコ Medical image display device and medical image display method

Families Citing this family (9)

Publication number Priority date Publication date Assignee Title
US6570952B2 (en) * 2001-02-27 2003-05-27 Siemens Corporate Research, Inc. Memory efficient shear-warp voxel projection algorithm
US7561725B2 (en) * 2003-03-12 2009-07-14 Siemens Medical Solutions Usa, Inc. Image segmentation in a three-dimensional environment
DE102005023167B4 (en) * 2005-05-19 2008-01-03 Siemens Ag Method and device for registering 2D projection images relative to a 3D image data set
US7978191B2 (en) 2007-09-24 2011-07-12 Dolphin Imaging Systems, Llc System and method for locating anatomies of interest in a 3D volume
GB201003065D0 (en) * 2010-02-23 2010-04-07 Simpleware Ltd Image processing method and method of three-dimensional printing incorporating the same
US9728001B2 (en) * 2011-09-23 2017-08-08 Real-Scan, Inc. Processing and rendering of large image files
GB2515510B (en) 2013-06-25 2019-12-25 Synopsys Inc Image processing method
CN107209957B (en) * 2015-01-30 2021-04-02 惠普发展公司有限责任合伙企业 Generating slice data from a voxel representation
US10379611B2 (en) * 2016-09-16 2019-08-13 Intel Corporation Virtual reality/augmented reality apparatus and method

Family Cites Families (9)

Publication number Priority date Publication date Assignee Title
US5803082A (en) * 1993-11-09 1998-09-08 Staplevision Inc. Omnispectramammography
EP0780010A4 (en) * 1994-09-06 1997-11-12 Univ New York State Res Found Apparatus and method for real-time volume visualization
US6002738A (en) * 1995-07-07 1999-12-14 Silicon Graphics, Inc. System and method of performing tomographic reconstruction and volume rendering using texture mapping
US6064393A (en) * 1995-08-04 2000-05-16 Microsoft Corporation Method for measuring the fidelity of warped image layer approximations in a real-time graphics rendering pipeline
US6009212A (en) * 1996-07-10 1999-12-28 Washington University Method and apparatus for image registration
US5956418A (en) * 1996-12-10 1999-09-21 Medsim Ltd. Method of mosaicing ultrasonic volumes for visual simulation
US5787889A (en) * 1996-12-18 1998-08-04 University Of Washington Ultrasound imaging with real time 3D image reconstruction and visualization
US6167297A (en) * 1999-05-05 2000-12-26 Benaron; David A. Detecting, localizing, and targeting internal sites in vivo using optical contrast agents
US6330356B1 (en) * 1999-09-29 2001-12-11 Rockwell Science Center Llc Dynamic visual registration of a 3-D object with a graphical model

Non-Patent Citations (1)

Title
LACROUTE P: "REAL-TIME VOLUME RENDERING ON SHARED MEMORY MULTIPROCESSORS USING THE SHEAR-WARP FACTORIZATION", PARALLEL RENDERING SYMPOSIUM (PRS),US,NEW YORK, ACM, 30 October 1995 (1995-10-30), pages 15 - 22, XP000598693, ISBN: 0-89791-774-1 *


Also Published As

Publication number Publication date
US20030012419A1 (en) 2003-01-16
CA2286447C (en) 2009-01-06
CA2286447A1 (en) 2001-04-15
US20040170311A1 (en) 2004-09-02
US7031505B2 (en) 2006-04-18
AU7765900A (en) 2001-04-30


Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AL AM AT AU AZ BA BB BG BR BY CA CH CN CR CU CZ DE DK DM EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
WWE Wipo information: entry into national phase

Ref document number: 10122148

Country of ref document: US

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP