4D Tomographic Image Reconstruction and Parametric Maps Estimation: a model-based strategy for algorithm design using Bayesian inference in Probabilistic Graphical Models

UNIVERSITÀ DI PISA

DIPARTIMENTO DI INGEGNERIA DELL’INFORMAZIONE

Dottorato di Ricerca in Ingegneria dell’Informazione

Ph.D. Thesis summary by the student: Michele SCIPIONI

Ph.D. Program, cycle XXXI

Tutor(s): Prof. Luigi Landini; Maria Filomena Santarelli, Ph.D.

This work is inspired by the search for an answer to two grand challenges affecting 4D emission tomography: the solution of the inverse problem of computing the rate of emission in the imaging volume in an extreme photon-limited regime, and the estimation of maps of pharmacokinetic parameters. The strategy proposed in this thesis to tackle these issues is based on the idea that a unified and synergistic approach to the estimation of both dynamic activity time series and parametric maps could provide mutual benefits, by complementing the lack of measured information with predictions made by the chosen model.

Framing emission tomography imaging in the Bayesian framework via probabilistic graphical models, we are able to define a model-based approach to the design of integrated inference algorithms. On the one hand, this modeling approach has shown itself able to encompass the traditional literature on emission tomography image reconstruction. On the other, it provides a flexible tool to describe causal relationships between variables, and a straightforward strategy to derive inference algorithms from such a combination of graphical and probabilistic representations.

A number of different models are proposed, justified, and discussed in the light of the model-based inference framework presented in this thesis. A comprehensive description of the phenomenon of image formation allows us to devise unified inference approaches that tackle at once, and in a synergistic way, multiple problems that are traditionally dealt with sequentially.

The formulations presented in this thesis are unifying in several ways: they combine in a single model information from multiple domains, and they attempt to unify reconstruction and kinetic modeling, a task usually addressed with a sequential approach. Moreover, this modeling approach is able to abstract over details that are specific to a certain imaging modality, in such a way that the inference strategies developed for PET can be (quite) easily adapted to other imaging modalities facing similar challenges (such as the case of DCE-MRI discussed in this work), requiring only minor changes to the assumptions made during model design.

PART I: Positron Emission Tomography: a brief state of the art

Part I of this thesis is mostly introductory and aimed at building a common nomenclature and a background of knowledge based on (historic or recent) literature. Chapter 2 introduces basic concepts of the physics of emission tomography, giving an overall idea of how the interaction between the injected radiotracer and biological tissues produces an emission of radiation that can be detected and recorded by the hardware of the PET scanner. Chapter 3 discusses the historical differences between analytic and iterative reconstruction, with a specific focus on the most used reconstruction algorithms based on a statistical (frequentist) description of the problem. Chapter 4 presents the idea of dynamic imaging and the main challenges related to dealing with 4D emission tomography scans. The second part of that chapter provides a brief introduction to kinetic modeling (with a specific focus on compartmental models), which will be used extensively in the following chapters. Chapter 5 closes this first part by introducing the concept of direct parametric image reconstruction, discussing a few of the attempts made in the last 10-15 years to estimate parametric maps directly from projection measurements, avoiding a preliminary reconstruction step.

PART II: A model-based approach to tomographic image reconstruction: problem representation and inference for static and dynamic data

Part II constitutes the main body of the thesis, and it is entirely dedicated to the description of the model-based strategy for the design of generative models for emission tomography, and to the derivation of inference algorithms. Chapter 6 opens this part by introducing the concept of model-based machine learning as an approach to problem-solving based on the separation between the phases of model design and inference. Key concepts of probabilistic graphical modeling theory are presented, as the tool of choice that allows us to frame emission tomography imaging in the context of Bayesian inference: the conventions used in model design and inference are explained and justified, serving as the rationale for the following chapters.

Chapter 7 specializes the mathematical formalism of probabilistic graphical models to the case of the tomographic reconstruction problem. A key feature of the proposed modeling framework is its ability to encompass the current state of the art. Therefore, in this chapter we present a simple, two-node, directed graphical model describing the causal relationship between the activity image (x) and the projection counts (y) via conditional probabilities. By adopting the common assumption of p(y|x) following a Poisson distribution, this model naturally guides us to the derivation of conventional reconstruction methods, like MLEM or OSL-MAPEM. Such derivations constitute the building blocks for the subsequent chapters.
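As a concrete sketch of this derivation (the notation is introduced here for illustration: A is the system matrix with elements a_ij, so that E[y] = Ax), the Poisson likelihood and the classical MLEM update it leads to can be written as

    p(y \mid x) = \prod_i \mathrm{Poisson}\!\left(y_i ;\, [Ax]_i\right),
    \qquad
    x_j^{(n+1)} = \frac{x_j^{(n)}}{\sum_i a_{ij}} \sum_i a_{ij} \, \frac{y_i}{[A x^{(n)}]_i}.

Maximizing log p(y|x) via the EM algorithm yields exactly this multiplicative update, which automatically preserves the non-negativity of the activity estimate.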

Moreover, the decoupling between model design and inference, typical of model-based machine learning, creates a level of abstraction that allowed us to easily test new hypotheses and to efficiently derive suitable inference algorithms, opening up interesting new perspectives. To give a first example of this flexibility, Chapter 8 starts by discussing the possibility that artifact corrections may cause deviations from Poisson statistics in the measured projection counts. Adapting the PGM to this condition is a matter of changing the interpretation of one of the nodes in the graph (y), which now represents pre-corrected sinogram counts, and the definition of p(y|x), for which we identified the Negative Binomial distribution as a good candidate to model possibly over-dispersed count data. The proposed model-based framework is used as a tool to derive a novel reconstruction method (NB-MLEM), which shows meaningful noise reduction in this specific scenario.
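As a sketch of the modified likelihood (the mean-dispersion parameterization below is one common choice, assumed here), the Negative Binomial keeps the same mean as the Poisson model but adds a dispersion parameter r that inflates the variance:

    p(y_i \mid x) = \mathrm{NB}\!\left(y_i ;\, \mu_i, r\right),
    \quad \mu_i = [Ax]_i,
    \quad \mathrm{Var}[y_i] = \mu_i + \frac{\mu_i^2}{r} \ \ge\ \mu_i.

In the limit r → ∞ the Poisson model is recovered, so this likelihood strictly generalizes the conventional one.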

Chapter 9 discusses the first extension of the graphical model used in the previous chapters to account for dynamic data. The inference strategy derived for this model is a trivial extension of what was seen in Chapter 7: the estimation of time-frame images still does not exploit the temporal and kinetic knowledge coming from parametric modeling, which is still performed as a post-reconstruction step. The second part of this chapter focuses on presenting solutions to the two main problems of this indirect approach to parametric map estimation: long execution times and high noise.

We tackle the problem of long execution times in two ways. First, we discuss an approach to reduce the time-to-fit of a single voxel's TAC, by avoiding numeric integration of ODE systems and convolutions for which it is possible to provide an analytic solution based only on sums and products. This trick allows a speed-up factor between 40x and 100x (depending on the alternative implementation it was compared against).

The low SNR in the estimated maps is mostly due to an estimation strategy in which each voxel is fitted independently of all the others. This, coupled with the complex mathematical form typical of kinetic models and the noise in single-voxel TACs, causes the fits of voxels close in space to converge to different local minima, and parametric maps to show great short-range variability. To tackle this issue, we present a model-based description of the problem of voxelwise parametric map estimation from an already-reconstructed 4D activity image. The inference over this model results in the derivation of an OSL-MAP version of the standard Levenberg-Marquardt (LM) algorithm for nonlinear least-squares optimization, in which a spatial prior may help neighboring voxels converge to similar solutions. In particular, we propose to use an L1-norm Huber prior, which shows a positive impact on the kinetic maps' bias-to-noise ratio. The computation of such a prior, however, requires a parallel and synchronized update of the parameter estimates for all voxels in the volume. Therefore, we present a GPU library developed in Python+CUDA, focused on combining the benefits of the analytic form of compartmental models and the model-based penalized estimation derived previously. As an additional benefit, porting the problem of voxelwise fitting to the GPU produced a further speed-up of 300x, allowing us to produce parametric maps from 4D emission images in less than 10 seconds.
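To make the role of the spatial penalty concrete, here is a minimal NumPy sketch of the gradient of an L1-norm Huber prior over nearest-neighbour differences of a 2D parametric map (the function name, the array layout, and the influence function psi(d) = clip(d/delta, -1, 1) are illustrative assumptions; the thesis's actual implementation is the Python+CUDA GPU library mentioned above):

    import numpy as np

    def huber_prior_grad(p_map, delta):
        # Illustrative sketch, not the thesis implementation.
        # Gradient of R(p) = sum over nearest-neighbour differences
        # d = p_right - p_left of a Huber penalty rho(d), whose
        # derivative is psi(d) = clip(d / delta, -1, 1).
        g = np.zeros_like(p_map)
        for axis in (0, 1):
            d = np.diff(p_map, axis=axis)        # neighbour differences
            psi = np.clip(d / delta, -1.0, 1.0)  # Huber influence function
            pad = [(0, 0)] * p_map.ndim
            pad[axis] = (0, 1)
            g -= np.pad(psi, pad)                # d rho / d p_left  = -psi(d)
            pad[axis] = (1, 0)
            g += np.pad(psi, pad)                # d rho / d p_right = +psi(d)
        return g

In an OSL (one-step-late) scheme, this gradient is evaluated at the current parameter estimate and added to the data-fit gradient inside each LM iteration, which is what makes a synchronized all-voxel update (and hence the GPU port) necessary.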

The main focus of this work is, however, to explore the richness of information coming from dynamic PET studies, and to integrate the knowledge acquired via kinetic modeling of such time courses into the reconstruction itself. In Chapter 10 we describe two different models that try to achieve this result. At first, we propose a probabilistic graph based on the idea of modeling a direct causal relationship between kinetic parameters and projection measures (θ → y). This model follows the guiding idea behind direct parametric estimation methods, and uses a deterministic relationship between the parameter space θ and the image space x to avoid (or 'hide') the image reconstruction step. The inference over this graph resulted in a novel reconstruction strategy based on a combination of EM and Iterated Conditional Modes (ICM), able to account also for a prior distribution defined in the kinetic parameter domain. The full potential of the PGM-based framework is then explored in Section 10.2, where the deterministic description of the voxels' TACs is replaced by the key assumption that the activity time course x would be subject to uncertainty even if the parameters θ of the underlying dynamic process were known. This allows us to model all the causal relationships between the random variables in the graph as conditional probabilities linked together by the chain rule. The inference of the complete joint distribution p(x, y, θ) = p(θ)p(x|θ)p(y|x) can be tackled using a new gradient-based algorithm for kinetic-informed image reconstruction, which presents several advantages compared to existing methods: it is simpler to implement; it enables the inclusion of arbitrary (sub)differentiable priors for the parametric maps; and it is flexible with respect to an arbitrary choice of the (linear or nonlinear) kinetic model.
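As a sketch of the inference target (the alternating-gradient reading below, with step sizes α and β, is introduced here for illustration), maximizing the log of the complete joint distribution amounts to

    \log p(x, \theta \mid y) = \log p(y \mid x) + \log p(x \mid \theta) + \log p(\theta) + \mathrm{const},

which can be ascended by alternating updates of the two blocks of unknowns:

    x \leftarrow x + \alpha \, \nabla_x \big[ \log p(y \mid x) + \log p(x \mid \theta) \big],
    \qquad
    \theta \leftarrow \theta + \beta \, \nabla_\theta \big[ \log p(x \mid \theta) + \log p(\theta) \big].

Since only gradients of the log-densities are required, any (sub)differentiable prior p(θ) and any differentiable kinetic model inside p(x|θ) can be plugged in, which is where the flexibility claimed above comes from.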

Chapter 11 follows in the footsteps of Chapter 8 in providing a showcase of the abstraction power of the model-based inference framework presented and used throughout this thesis. Here, we discuss an adaptation of the model of 4D imaging designed for emission tomography to the context of DCE-MRI. Despite many differences in the physics and biology of the phenomenon, we are still dealing with 4D raw data acquired after the injection of a contrast agent, whose concentration over time may be described by specific kinetic models. Abstracting the imaging problem means changing the assumptions about the interpretation of the node (y), which now represents the measured k-space, and about p(y|x), for which we can use a Gaussian likelihood to model the (Rician) MRI noise. More than being just an exercise in style, we show how the resulting inference algorithm keeps the benefit of a traditional total variation (TV) penalty, i.e. removing time-incoherent aliasing artifacts, while significantly improving the temporal fidelity of the final result.
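Under these assumptions, the data-fit term becomes a simple least-squares functional (the symbol E, standing for the sampled Fourier encoding operator, is notation introduced here):

    p(y \mid x) = \mathcal{N}\!\left(y ;\, E x, \sigma^2 I\right)
    \quad \Rightarrow \quad
    -\log p(y \mid x) \ \propto\ \frac{1}{2\sigma^2} \, \lVert y - E x \rVert_2^2,

so that the rest of the inference machinery (priors, gradients, alternating updates) carries over essentially unchanged from the emission tomography case.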

At last, in Chapter 12, after having constructed a model of what happens during a 4D tomographic PET scan, we extend this model to account for additional elements (i.e., assumptions) which may provide greater insight during the inference phase. In particular, we encode in a new model the assumption that the imaged volume is formed by a finite number of different tissues (z), and that each voxel's specific kinetic behavior results from its belonging to a specific functional cluster. This is a rather more complex model, in which the activity in a voxel is modeled as a mixture of Normal distributions with as many components as there are tissues, and its expected value is a weighted sum of the activity predicted by the kinetic model applied to each cluster mean. The resulting inference algorithm provides multiple advantages in terms of speed (kinetic modeling is applied only to the mean tissue TACs, not voxelwise), SNR (within-cluster edge-preserving smoothing), and, eventually, detail recovery (the speed-up allows us to test different models for each tissue and choose the best one, with the possibility of using multiple different kinetic models while studying a single volume).
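As a sketch of this mixture assumption (the symbols are introduced here for illustration: z_j is the tissue label of voxel j, π_k the mixing weights, and f(t; θ_k) the kinetic-model prediction for cluster k):

    p\big(x_j(t) \mid \theta\big) = \sum_{k=1}^{K} \pi_k \, \mathcal{N}\!\left(x_j(t) ;\, f(t; \theta_k),\, \sigma_k^2\right),
    \qquad
    \mathbb{E}\big[x_j(t)\big] = \sum_{k=1}^{K} \pi_k \, f(t; \theta_k).

Fitting the K cluster-level parameter vectors θ_k, instead of one parameter vector per voxel, is what yields the speed advantage mentioned above.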

PART III: Major contributions and future directions

This last part of the thesis summarizes the main results achieved during this Ph.D. and briefly describes Occiput.io, an open-source, GPU-based software package for dynamic and multi-modal tomographic image reconstruction, which implements the algorithms described in the previous chapters.

Michele SCIPIONI Pisa, June 18, 2019
