Description
Iron-oxide nanoparticles are used as negative contrast agents in Magnetic Resonance Imaging (MRI) [1]. Their high magnetization induces magnetic field inhomogeneities that shorten the relaxation times of the surrounding water protons. Their efficiency is quantified by their relaxivities, i.e. their relaxation rates normalized by the iron concentration.
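For clarity, the standard definition of the relaxivities (a general definition, not specific to this work) can be written as

r_i = \frac{R_i - R_i^{(0)}}{[\mathrm{Fe}]} = \frac{1/T_i - 1/T_i^{(0)}}{[\mathrm{Fe}]}, \qquad i = 1, 2,

where R_i^{(0)} is the relaxation rate of the medium without contrast agent and [Fe] is the iron concentration (usually in mM), so that r_i is expressed in s^{-1} mM^{-1}.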
Cells loaded with iron-oxide nanoparticles are commonly used to track tumor cells in vivo: the cells are labelled in vitro and then injected into the animal. MR images allow these cells to be followed non-invasively and should ideally lead to their quantification in vivo. Unfortunately, when the particles are internalized in cells, the associated relaxivities decrease, producing a less efficient contrast on the MR image and making the labelled cells more difficult to interpret or to quantify [2]. An accurate model predicting the relaxivities associated with cells loaded with iron-oxide nanoparticles would make their quantification feasible.
The decrease of the relaxivities is usually attributed to nanoparticle agglomeration, which is well known to modify the relaxation rates [3]. Water exchange also seems to influence the contrast [4]. However, no complete study has been performed to identify the respective impact of these mechanisms on the relaxation times. In this work, we present experimental and simulated NMR results on samples containing iron-oxide-nanoparticle-loaded cells suspended in an agarose gel. Nuclear Magnetic Relaxation Dispersion (NMRD) measurements were performed for different iron and cell concentrations. Simulations reproducing the transverse relaxation at high magnetic field and modelling the labelled cells suspended in gel were also performed and are compared to the experiments.
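As an illustration of the kind of Monte Carlo approach commonly used to model transverse relaxation around magnetized cells, the following minimal Python sketch (a hypothetical example, not the code used in this work) lets water protons perform a random walk around a single impermeable magnetized sphere, accumulates the phase induced by its dipolar field, and extracts an apparent transverse relaxation rate from the simulated free-induction decay. All parameter values (sphere radius, equatorial frequency shift, diffusion coefficient, time step, number of walkers) are illustrative assumptions.

    import numpy as np

    # --- Illustrative parameters (assumed values, not those of the study) ---
    rng = np.random.default_rng(0)
    R_sphere = 5e-6          # radius of the magnetized sphere (m), e.g. a labelled cell
    L_box = 40e-6            # half-size of the cubic simulation box (m)
    D = 2.3e-9               # water diffusion coefficient (m^2/s)
    delta_omega_eq = 1.0e3   # equatorial angular-frequency shift at the sphere surface (rad/s)
    dt = 1e-4                # time step (s)
    n_steps = 1000           # number of steps (total simulated time 0.1 s)
    n_protons = 5000         # number of random walkers

    def dipolar_offset(pos):
        """Angular-frequency offset of the dipolar field of a uniformly
        magnetized sphere at positions pos (N, 3), main field along z."""
        r = np.linalg.norm(pos, axis=1)
        cos_theta = pos[:, 2] / r
        return delta_omega_eq * (R_sphere / r) ** 3 * (3 * cos_theta ** 2 - 1)

    # Start the protons uniformly outside the sphere
    pos = rng.uniform(-L_box, L_box, size=(n_protons, 3))
    inside = np.linalg.norm(pos, axis=1) < R_sphere
    while inside.any():
        pos[inside] = rng.uniform(-L_box, L_box, size=(inside.sum(), 3))
        inside = np.linalg.norm(pos, axis=1) < R_sphere

    phase = np.zeros(n_protons)
    signal = np.empty(n_steps)
    sigma = np.sqrt(2 * D * dt)   # rms displacement per axis per step

    for step in range(n_steps):
        trial = pos + rng.normal(0.0, sigma, size=pos.shape)
        # Impermeable sphere: reject steps that would end inside it
        blocked = np.linalg.norm(trial, axis=1) < R_sphere
        trial[blocked] = pos[blocked]
        # Periodic boundaries on the box
        pos = (trial + L_box) % (2 * L_box) - L_box
        # Phase accumulated during this step
        phase += dipolar_offset(pos) * dt
        signal[step] = np.abs(np.mean(np.exp(1j * phase)))

    # Apparent transverse relaxation rate from a mono-exponential fit of the decay
    t = dt * np.arange(1, n_steps + 1)
    R2_star = -np.polyfit(t, np.log(signal), 1)[0]
    print(f"Estimated R2* = {R2_star:.1f} s^-1")

A simulation of the samples studied here would additionally include many cells at the experimental cell density, refocusing pulses to mimic the acquisition sequence, and the restricted diffusion imposed by the gel, but the phase-accumulation principle sketched above is the same.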