Maurizio Conti, Siemens Molecular Imaging, Knoxville, Tennessee, USA Introduction to PET reconstruction
Summary
• Emission data corrections
• Analytical methods (FBP)
• Iterative methods (MLEM, OSEM)
• Point Spread Function (PSF)
• Time-of-Flight (TOF)
Emission Sinogram (figure: projections of the object at 0°, 45°, and 90° form the rows of the sinogram)
Need to correct the data (figure: the effect of attenuation, scatter, and normalization on a four-region phantom A–D)
Correction chain (figure): the emission sinogram is combined with the normalization factors, a scatter simulation for scatter correction, and the μ-map for attenuation correction.
Corrections applied to the emission sinogram E (X denotes the reconstruction operator):
normalization N: recon = X{ N × E }
attenuation A: recon = X{ A × (N × E) }
scatter S: recon = X{ A × (N × E) − S }
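The correction chain above maps directly onto a few array operations. A minimal sketch, assuming the sinograms are NumPy arrays of matching shape: E is the measured emission sinogram, N the normalization factors, A the attenuation correction factors, and S the scatter estimate (the reconstruction operator X itself is not shown):

```python
import numpy as np

def correct_sinogram(E, N, A, S):
    """Apply normalization, attenuation and scatter corrections: A * (N * E) - S."""
    corrected = A * (N * E) - S
    # Scatter subtraction can produce negative bins; clip to keep the data nonnegative.
    return np.clip(corrected, 0.0, None)

# Toy example with arbitrary data of matching shape (angles x radial bins).
rng = np.random.default_rng(0)
E = rng.poisson(10.0, size=(180, 128)).astype(float)   # measured emission sinogram
N = np.ones_like(E)        # normalization factors (illustrative)
A = np.full_like(E, 1.5)   # attenuation correction factors (illustrative)
S = np.full_like(E, 1.0)   # simulated scatter estimate (illustrative)
corrected = correct_sinogram(E, N, A, S)               # input to the reconstruction X
```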
Analytical: “exact” solution to a system of equations; for example, Filtered Back Projection (FBP), in image or Fourier space. Iterative: an iterative process in which an intermediate image converges toward the “true” image; for example, Maximum Likelihood Expectation Maximization (MLEM), Ordered Subsets Expectation Maximization (OSEM), Row Action Maximum Likelihood Algorithm (RAMLA). Reconstruction
Reconstruction from projections. Radon (1917): a projection is the set of line integrals of the object, p(x_r, φ) = ∫ f(x, y) dy_r = R{f(x, y)}; the image is recovered by inverting the Radon transform, f(x, y) = R⁻¹{p(x_r, φ)}. Backprojection operator: b(x, y) = ∫ p(x_r, φ) dφ.
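As a small numerical illustration of these two operators, the sketch below computes projections by rotating the image and summing along one axis, and backprojects by smearing each projection back across the image; SciPy's rotate is assumed available, and the conventions are illustrative rather than those of a real scanner geometry:

```python
import numpy as np
from scipy.ndimage import rotate

def radon(image, angles_deg):
    """p(x_r, phi): line integrals of the image at each projection angle."""
    return np.stack([rotate(image, a, reshape=False, order=1).sum(axis=0)
                     for a in angles_deg])

def backproject(sinogram, angles_deg, size):
    """b(x, y): smear every projection back over the image and sum over angles."""
    image = np.zeros((size, size))
    for p, a in zip(sinogram, angles_deg):
        smear = np.tile(p, (size, 1))                  # constant along the ray direction
        image += rotate(smear, -a, reshape=False, order=1)
    return image / len(angles_deg)

angles = np.arange(0.0, 180.0, 1.0)
phantom = np.zeros((128, 128)); phantom[48:80, 48:80] = 1.0
sino = radon(phantom, angles)
blurred = backproject(sino, angles, 128)   # plain backprojection gives a blurred image
```

Plain backprojection reproduces the object only up to a blur (the star artifact), which is what the filtering discussed on the next slides removes.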
Filtered back projection in image space Apply a filter function to the projections before back projection
Filtered back projection in Fourier space Central slice (or central section) theorem
Filtered back projection in Fourier space: 1) Fourier transform (1D) of a projection; 2) filter the projection in frequency space; 3) repeat for all angles φ; 4) interpolate onto the rectangular grid; 5) inverse Fourier transform (2D).
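The recipe above is essentially direct Fourier reconstruction. The following rough sketch follows the five steps; sign and shift conventions are illustrative, SciPy is assumed for the interpolation, and the optional frequency-space smoothing filter of the next slide is omitted:

```python
import numpy as np
from scipy.interpolate import griddata

def direct_fourier_recon(sinogram, angles_deg):
    n_ang, n_rad = sinogram.shape
    # 1) + 3) 1-D Fourier transform of every projection (f = 0 centred)
    P = np.fft.fftshift(np.fft.fft(np.fft.ifftshift(sinogram, axes=-1), axis=-1), axes=-1)
    f = np.fft.fftshift(np.fft.fftfreq(n_rad))
    # 2) an additional low-pass filter could be applied to P here (omitted)
    # Central slice theorem: each row of P samples the 2-D transform along a radial line
    phi = np.deg2rad(np.asarray(angles_deg))[:, None]
    pts = np.column_stack([(f[None, :] * np.cos(phi)).ravel(),
                           (f[None, :] * np.sin(phi)).ravel()])
    # 4) interpolate the polar samples onto a rectangular frequency grid
    U, V = np.meshgrid(f, f)
    F = (griddata(pts, P.real.ravel(), (U, V), method='linear', fill_value=0.0)
         + 1j * griddata(pts, P.imag.ravel(), (U, V), method='linear', fill_value=0.0))
    # 5) 2-D inverse Fourier transform gives the reconstructed image
    return np.fft.fftshift(np.fft.ifft2(np.fft.ifftshift(F))).real
```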
Filtered back projection in Fourier space: filters • The ramp filter is used to eliminate the star artifact and improve spatial resolution, but it also amplifies the noise (high frequencies). • To compensate for these effects, low-pass smoothing filters are applied to cut off frequencies higher than a certain limit. (Figure: ramp filter and additional smoothing filter.)
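A minimal sketch of this filtering step: the ramp filter |f| is apodized by a Hann window before each projection is filtered in frequency space (the cutoff value is illustrative, not a recommended clinical setting); the filtered projections are then backprojected as in the earlier sketch.

```python
import numpy as np

def filter_projections(sinogram, cutoff=1.0):
    """Apply ramp * Hann filtering to every projection (row) of the sinogram."""
    n = sinogram.shape[-1]
    freqs = np.fft.fftfreq(n)                    # frequencies in cycles per sample
    ramp = np.abs(freqs)                         # |f|: the ramp filter
    hann = 0.5 * (1.0 + np.cos(np.pi * freqs / (0.5 * cutoff)))   # smoothing window
    hann[np.abs(freqs) > 0.5 * cutoff] = 0.0     # zero above the cutoff frequency
    H = ramp * hann
    return np.real(np.fft.ifft(np.fft.fft(sinogram, axis=-1) * H, axis=-1))
```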
The Maximum Likelihood Expectation Maximization (MLEM) algorithm is based on the maximization of the logarithm of a Poisson-likelihood probability function. It updates the image at each iteration by a multiplicative factor computed as the ratio between the original acquired projections and the newly estimated ones. Advantages of this iterative method are low noise, good spatial resolution, and the fact that all reconstructed values are positive because a nonnegativity condition is imposed on the original data. The main disadvantage is the large number of iterations required to converge to an optimal solution. To overcome the slow convergence, the ordered-subsets expectation maximization (OSEM) algorithm was proposed in 1994. OSEM is a modified version of MLEM, the main difference being that the projections are grouped into subsets of angles. Within each iteration the image is updated as many times as there are subsets, proportionally accelerating convergence. Iterative methods
MLEM and OSEM. Given the measured projection data Y_j and the system matrix c_ij, which image λ maximizes the likelihood L(λ)? The MLEM update is λ_i^(k+1) = (λ_i^(k) / Σ_j c_ij) · Σ_j c_ij Y_j / (Σ_i' c_i'j λ_i'^(k)). OSEM applies the same update using only one subset of projections j at a time.
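The update above translates almost directly into code. A compact sketch, assuming an explicit system matrix C (projection bins × voxels) that fits in memory, which only holds for toy problems; subsets=1 gives plain MLEM, larger values give OSEM:

```python
import numpy as np

def mlem_osem(Y, C, n_iter=10, subsets=1):
    """Y: measured counts per bin, C: system matrix (bins x voxels)."""
    n_bins, n_vox = C.shape
    lam = np.ones(n_vox)                             # initial image estimate
    subset_idx = np.array_split(np.arange(n_bins), subsets)
    for _ in range(n_iter):
        for sub in subset_idx:                       # one image update per subset
            Cs, Ys = C[sub], Y[sub]
            proj = Cs @ lam                          # forward projection of current image
            ratio = np.where(proj > 0, Ys / proj, 0.0)
            sens = Cs.sum(axis=0)                    # subset sensitivity sum_j c_ij
            lam = lam * (Cs.T @ ratio) / np.maximum(sens, 1e-12)
            # nonnegativity is preserved automatically: every factor is >= 0
    return lam
```

With, say, 12 subsets the image is updated 12 times per pass through the data, which is the source of the roughly proportional acceleration mentioned above.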
MLEM: L.A. Shepp and Y. Vardi, "Maximum likelihood reconstruction for emission tomography," IEEE Trans. Med. Imaging, vol. 1, no. 2, pp. 113-122, 1982. MLEM: K. Lange and R. Carson, "EM reconstruction algorithms for emission and transmission tomography," Journal of Computer Assisted Tomography, vol. 8, no. 2, pp. 306-316, 1984. OSEM: H.M. Hudson and R.S. Larkin, "Accelerated image reconstruction using ordered subsets of projection data," IEEE Trans. Med. Imaging, vol. 13, no. 4, pp. 601-609, 1994. MLEM and OSEM, references
MLEM, with physics corrections (figure: the scatter, randoms, normalization, and attenuation corrections are built into the forward projector and back projector of the iterative loop).
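One common way to realize this is to leave the measured data uncorrected and put the correction terms into the projectors instead (often called ordinary-Poisson MLEM); the sketch below uses illustrative names, with norm and atten as per-LOR multiplicative factors and scatter and randoms as additive background estimates:

```python
import numpy as np

def mlem_update_with_corrections(lam, Y, C, norm, atten, scatter, randoms):
    """One MLEM image update with the physics corrections inside the projectors."""
    mult = norm * atten                                  # multiplicative per-LOR factors
    expected = mult * (C @ lam) + scatter + randoms      # model of the measured data
    ratio = np.where(expected > 0, Y / expected, 0.0)
    backproj = C.T @ (mult * ratio)                      # corrected back projection
    sens = C.T @ mult                                    # sensitivity image with corrections
    return lam * backproj / np.maximum(sens, 1e-12)
```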
2D or 3D reconstruction. 2D reconstruction is faster. Rebinning techniques move 3D data into 2D data. New fast algorithms and computers make rebinning less necessary!
Recent reconstruction improvements. Iterative algorithm (OSEM): rebin 3D→2D + OSEM-2D; OSEM-3D; OSEM-3D + Point Spread Function (PSF); OSEM-3D + Point Spread Function + Time-of-Flight (TOF).
When a photon strikes a crystal, it travels a certain distance before its energy is converted into light. If the photon comes from the center of the field of view (FOV), the line of response (LOR) is likely to be correctly localized in the crystal that the photon entered. The further away from the center of the FOV, the less likely the LOR will be assigned correctly, because the photon hits the crystal at an angle and may continue traveling into a neighboring crystal before its energy is converted into light. Point Spread Function (PSF)
Point Spread Function (PSF) A Point Spread Function (PSF) describes the response of an imaging system to a point source or point object. A system that knows the response of a point source from everywhere in its field of view can use this information to recover the original shape and form of imaged objects. PSFs are used to make geometric corrections to the final image in precision imaging fields such as microscopy, ophthalmology, and astronomy (e.g. the Hubble telescope).
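In iterative PET reconstruction this idea is usually realized by building the measured PSF into the system model. A much-simplified, shift-invariant sketch: blur the image inside the forward projector and apply the matched blur after the back projector (the Gaussian kernel and the project/backproject callables are placeholders, not a measured scanner PSF):

```python
from scipy.ndimage import gaussian_filter

def forward_with_psf(image, project, sigma=1.0):
    """Apply the PSF blur to the image, then the geometric forward projector."""
    return project(gaussian_filter(image, sigma))

def backward_with_psf(sinogram, backproject, sigma=1.0):
    """Apply the geometric back projector, then the matched (transpose) blur."""
    return gaussian_filter(backproject(sinogram), sigma)
```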
Conventional PET detects coincidence photons and records individual lines of response (LOR) between the crystals. The actual location where the annihilation occurred along the LOR is not measured. Siemens ultraHD•PET with Time of Flight (TOF) measures the time difference between the detection of the two coincidence photons. This timing information is used to better localize the event along each LOR. Time-of-Flight (TOF)
TOF systems are able to record segments of response instead of lines of response. The time resolution defines the size of the segment of response (the "time bin"). Time-of-Flight (TOF) systems measure the arrival-time difference of the two coincidence photons, T1 − T2, to determine the event location along the line of response: x = c (T1 − T2) / 2, where c is the speed of light. The size Δx of the segment of response is directly proportional to the system's time resolution Δt: Δx = c Δt / 2.
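Putting the two relations above into numbers (the 500 ps timing resolution is an illustrative value, not a particular scanner specification):

```python
C_LIGHT = 2.998e8   # speed of light in m/s

def tof_position(t1_s, t2_s):
    """Offset of the annihilation point from the LOR centre: x = c*(T1 - T2)/2."""
    return C_LIGHT * (t1_s - t2_s) / 2.0

def tof_segment(dt_s):
    """Size of the segment of response for timing resolution dt: dx = c*dt/2."""
    return C_LIGHT * dt_s / 2.0

print(tof_segment(500e-12))   # a 500 ps resolution localizes the event to about 7.5 cm
```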
Future: faster implementations of the reconstruction algorithms; development of other algorithms (other than ML-based); list-mode instead of sinogram-based reconstruction; dynamic or 4D reconstructions; application-oriented reconstruction methods (oncology, brain function, cardiac, ...).