Characterization of PET image using global and local entropy
DOI: https://doi.org/10.15392/bjrs.v3i1A.128

Keywords: PET, ROI, Algorithms

Abstract
In clinical practice, PET imaging provides semi-quantitative information about metabolic activity in the human body through the Standardized Uptake Value (SUV). The SUV scale, by itself, does not establish thresholds between benign and malignant uptake in high-level analyses such as pattern recognition. The objective of this work is to investigate, in PET image volumes with high-uptake regions, two additional descriptors besides the SUV measurements: the amount of information given by the Hartley function (IHartley) and its expected value, the Shannon entropy (H). To estimate these descriptors, two models of the probability distribution were obtained from a high-uptake region of interest (ROI): (i) the normalized grayscale histogram of the SUV intensity levels (Pi), which provides the global IHG and HG; and (ii) the normalized gray level co-occurrence matrix (GLCM) of these gray levels (Pg,k) over the same range, which provides the local IHL and HL. The initial results have shown that, for a 12x12-pixel ROI with mean SUV of 6.6213±0.5196 g/ml and SUVmax = 14.7372 g/ml, the global entropy (2.3778±0.0364) has a higher average uncertainty than the local entropy (2.2069±0.0758), with a confidence interval of 99.95% (p-value < 0.05%). This can be explained by analyzing the sample through the amount of information, IHartley: on average, the local Pg,k provides up to 90.55±9.18% more information than the global Pi. Therefore, these initial results suggest that, when building algorithms for PET image segmentation using entropy-based thresholds, it is more appropriate to use a distribution-function estimator that considers the local information of the pixel intensities. The main application of this approach will be, among other things, to construct pathological phantoms from PET images for dosimetry applications.
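To make the two estimators concrete, the sketch below is a minimal illustration (not the authors' implementation) of how the global entropy HG could be computed from the normalized SUV histogram Pi and the local entropy HL from a normalized GLCM Pg,k, using NumPy. The number of gray levels (16), the single GLCM offset, and the synthetic 12x12 ROI are illustrative assumptions.

```python
import numpy as np

def quantize(roi, levels=16):
    """Quantize SUV values of a ROI into a fixed number of gray levels."""
    edges = np.linspace(roi.min(), roi.max(), levels + 1)[1:-1]
    return np.digitize(roi, edges)            # integers in [0, levels-1]

def global_entropy(roi, levels=16):
    """H_G: Shannon entropy of the normalized grayscale histogram P_i."""
    q = quantize(roi, levels)
    hist = np.bincount(q.ravel(), minlength=levels).astype(float)
    p = hist / hist.sum()                     # global distribution P_i
    p = p[p > 0]
    # H is the expected value of the self-information -log2(P_i)
    return float(-np.sum(p * np.log2(p)))

def local_entropy(roi, levels=16, offset=(0, 1)):
    """H_L: Shannon entropy of the normalized GLCM P_{g,k} for one offset."""
    q = quantize(roi, levels)
    dy, dx = offset
    glcm = np.zeros((levels, levels), dtype=float)
    rows, cols = q.shape
    for i in range(rows - dy):                # count co-occurring gray-level pairs
        for j in range(cols - dx):
            glcm[q[i, j], q[i + dy, j + dx]] += 1
    p = glcm / glcm.sum()                     # local distribution P_{g,k}
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Illustrative 12x12 ROI of synthetic SUV values (g/ml), not real patient data
rng = np.random.default_rng(0)
roi = rng.normal(6.62, 0.52, size=(12, 12))
print("H_G =", global_entropy(roi), " H_L =", local_entropy(roi))
```

The GLCM here is built for a single horizontal offset for simplicity; averaging over several offsets and angles is a common variation, and a library such as scikit-image could replace the explicit loops.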
License
Copyright (c) 2015 Brazilian Journal of Radiation Sciences
Licensing: The BJRS articles are licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/