The impact of positron emission tomography (PET) on radiation therapy is held back by poor methods of defining functional volumes of interest. Many new software tools are being proposed for contouring target volumes but the different approaches are not adequately compared and their accuracy is poorly evaluated due to the ill-definition of ground truth. This paper compares the largest cohort to date of established, emerging and proposed PET contouring methods, in terms of accuracy and variability. We emphasize spatial accuracy and present a new metric that addresses the lack of unique ground truth. Thirty methods are used at 13 different institutions to contour functional volumes of interest in clinical PET/CT and a custom-built PET phantom representing typical problems in image guided radiotherapy. Contouring methods are grouped according to algorithmic type, level of interactivity and how they exploit structural information in hybrid images. Experiments reveal benefits of high levels of user interaction, as well as simultaneous visualization of CT images and PET gradients to guide interactive procedures. Method-wise evaluation identifies the danger of over-automation and the value of prior knowledge built into an algorithm.
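The paper's new metric for handling non-unique ground truth is not reproduced here; as a baseline illustration of the kind of spatial-accuracy measure such comparisons rest on, contour agreement between a candidate volume and a reference delineation is commonly summarized by the Dice similarity coefficient. The following NumPy sketch (array shapes and names are illustrative assumptions, not the study's actual evaluation code) computes it for binary masks:

```python
import numpy as np

def dice_coefficient(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """Spatial overlap between two binary contour masks (1 = inside the volume).

    Dice = 2|A ∩ B| / (|A| + |B|), ranging from 0 (disjoint) to 1 (identical).
    """
    a = mask_a.astype(bool)
    b = mask_b.astype(bool)
    intersection = np.logical_and(a, b).sum()
    total = a.sum() + b.sum()
    if total == 0:
        return 1.0  # two empty volumes agree trivially
    return 2.0 * intersection / total

# Toy 2D slice: two overlapping square "contours" of 16 voxels each,
# sharing a 3x3 = 9-voxel intersection.
a = np.zeros((10, 10)); a[2:6, 2:6] = 1
b = np.zeros((10, 10)); b[3:7, 3:7] = 1
print(dice_coefficient(a, b))  # 2*9 / (16 + 16) = 0.5625
```

A volumetric overlap score alone cannot arbitrate between several plausible expert contours, which is precisely the ground-truth ambiguity the abstract raises.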
The scanning and computerized processing of images had its birth in 1956 at the National Bureau of Standards (NBS, now the National Institute of Standards and Technology, NIST) [1]. Image enhancement algorithms were among the first to be developed [2]. Half a century later, literally thousands of image processing algorithms have been published. Some are specific to particular applications, such as the enhancement of latent fingerprints, whilst others are more generic in nature, applicable to many tasks but optimized for none. The scope of these algorithms is fairly expansive, ranging from automatically extracting and delineating regions of interest, as in segmentation, to improving the perceived quality of an image by means of image enhancement.

Since the early years of image processing, as in many subfields of software design, a portion of the design process has been dedicated to algorithm testing. Testing is the process of determining whether a particular algorithm satisfies its specifications with respect to criteria such as accuracy and robustness. A major limitation in the design of image processing algorithms lies in the difficulty of demonstrating that an algorithm works to an acceptable measure of performance. The purpose of algorithm testing is two-fold. Firstly, it provides either a qualitative or a quantitative method of evaluating an algorithm. Secondly, it provides a comparative measure of the algorithm against similar algorithms, assuming similar criteria are used. One of the greatest difficulties in designing algorithms incorporating image processing is how to conceive the criteria used to analyze the results: do we design a criterion that measures sensitivity, robustness, or accuracy? Performance evaluation in the broadest sense refers to a measure of some required behavior of an algorithm, whether that is achievable accuracy, robustness, or adaptability.
It allows the intrinsic characteristics of an algorithm to be emphasized, as well as its benefits and limitations to be evaluated. More often than not, though, such testing has been limited in scope. Part of this is attributable to the lack of a formal process for the performance evaluation of image processing algorithms, from the establishment of testing regimes to the design of metrics. Selection of an appropriate evaluation methodology depends on the objective of the task. For example, in the context of image enhancement, the requirements are essentially different for screen-based enhancement and for enhancement embedded within a subalgorithm. Screen-based enhancement is usually assessed subjectively, whereas when an algorithm is encapsulated within a larger system, subjective evaluation is not available and the algorithm itself must determine the quality of a processed image. Very few approaches to the evaluation of image processing algorithms can be found in the literature, although the concept has been around for decades. A significant difficulty which arises in the evaluation of algorith...
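When subjective assessment is unavailable and an embedded algorithm must judge its own output, a reference-based quantitative criterion is the usual fallback. As one standard example (not a metric proposed by this text), peak signal-to-noise ratio compares a processed image against a known reference; a minimal sketch:

```python
import numpy as np

def psnr(reference: np.ndarray, processed: np.ndarray, peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio in dB; higher means closer to the reference."""
    mse = np.mean((reference.astype(float) - processed.astype(float)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(peak ** 2 / mse)

# Uniform error of 10 grey levels on an 8-bit image gives MSE = 100:
ref = np.full((8, 8), 100.0)
noisy = ref + 10.0
print(round(psnr(ref, noisy), 2))  # 10*log10(255^2 / 100) ≈ 28.13 dB
```

Reference-based scores like this only apply when ground truth exists, which is exactly why the no-reference case discussed above remains the harder evaluation problem.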
Speckle techniques are investigated for the characterization of pavement surface microtexture, in particular height variations from about one to ten micrometers in amplitude. Using the scalar diffraction theory of Kirchhoff, simulations of the speckle contrast were carried out to bring out characteristic patterns in the case of subjective speckle whose grains are not resolved by the CCD photodetectors, and in the case of a two-scale surface texture. We deduce the possibility of characterizing the fineness of the microtexture, or its evolution as wear progresses. The method is also applied experimentally to reference surfaces (abrasive papers of various finenesses) and to model pavement surfaces at various stages of wear.
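The quantity simulated above, speckle contrast, is conventionally defined as the ratio of the standard deviation to the mean of the recorded intensity. A short sketch (the synthetic exponential sample stands in for a real speckle intensity map and is an assumption of this illustration, not data from the study):

```python
import numpy as np

def speckle_contrast(intensity: np.ndarray) -> float:
    """Contrast C = std(I) / mean(I); C approaches 1 for fully developed speckle."""
    i = intensity.astype(float)
    return i.std() / i.mean()

# The intensity of fully developed speckle follows a negative-exponential
# distribution, whose theoretical contrast is exactly 1.
rng = np.random.default_rng(0)
sample = rng.exponential(scale=1.0, size=100_000)
print(speckle_contrast(sample))  # close to 1.0
```

Surface roughness below the wavelength scale, or spatial averaging by unresolved grains on the CCD, drives the measured contrast below this limit, which is what makes C usable as a texture-fineness indicator.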
Moiré methods are optical measurement methods that are based on the effect of superposition of grating lines and have been widely used in industrial applications for shape analysis, for non-contact measurement, and for quality control of industrial components. In applications, the following computations have to be performed for each test object before the numerically reconstructed test object shape can be compared with its CAD model: image filtering, fringe skeletonizing, and fringe numbering. In order to reduce the computing time required by these computations, the inverse moiré technique has been introduced by Harthong. Instead of using a grating made of parallel straight lines, the inverse moiré technique uses a pre-computed specific grating, formed of curved lines, such that the moiré pattern is composed of parallel straight fringes if the test object shape conforms to its CAD model. Defects are then characterized by a deformation and a curvature of these parallel fringes. In this paper, we present examples showing that standard fringe extraction by automatic thresholding is not straightforward. To overcome this difficulty, we propose a four-stage algorithmic approach that allows fringe detection in inverse moiré images with high sensitivity and specificity. First, we use the well-known image processing technique called unsharp masking to enhance the moiré image and to emphasize low-contrast fringes. The second step is to extract bright fringes by image segmentation and constrained contour modeling. After deletion of these bright fringes inside the zone of interest of the moiré image, we get the thick skeleton of the adjacent background and of the dark fringes.
The third step is to skeletonize this thick skeleton of the adjacent background and the dark fringes, using morphological thinning of well-composed sets, which guarantees that each fringe skeleton is one pixel thick, unlike standard thinning techniques. The fourth step is to apply a graph technique to isolate the individual dark fringes. When all four steps have been completed, one is left with a binary image showing the dark fringe pattern skeleton. The experimental results obtained show the robustness of this algorithmic approach for the analysis of noisy inverse moiré images.

1 Introduction

Moiré methods are optical measurement methods [2], [7], [8], [12] that are based on the effect of superposition of grating lines. These techniques have

SPIE Vol. 2786, 0-8194-2172-3/96/$6.00
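The first of the four steps, unsharp masking, admits a compact illustration. The sketch below is a generic pure-NumPy version under assumed parameters (box-blur low-pass, kernel radius, gain), not the authors' implementation: the low-pass residual is added back to the image so that faint fringes gain contrast.

```python
import numpy as np

def box_blur(image: np.ndarray, radius: int = 2) -> np.ndarray:
    """Mean filter with edge replication, used here as a simple low-pass stage."""
    k = 2 * radius + 1
    padded = np.pad(image.astype(float), radius, mode="edge")
    out = np.zeros(image.shape, dtype=float)
    for dy in range(k):            # sum the k*k shifted copies of the image
        for dx in range(k):
            out += padded[dy:dy + image.shape[0], dx:dx + image.shape[1]]
    return out / (k * k)

def unsharp_mask(image: np.ndarray, radius: int = 2, amount: float = 1.0) -> np.ndarray:
    """Unsharp masking: add the high-pass residual back to emphasize faint fringes."""
    img = image.astype(float)
    return img + amount * (img - box_blur(img, radius))

# A faint dark fringe on a flat background becomes more pronounced:
row = np.full((1, 11), 100.0)
row[0, 5] = 90.0                   # low-contrast fringe, 10 grey levels deep
sharpened = unsharp_mask(row, radius=1, amount=2.0)
# The fringe pixel is pushed darker and its neighbours brighter,
# increasing local contrast before the segmentation step.
```

In a production pipeline the box blur would typically be replaced by a Gaussian filter; the structure of the operation, sharpened = image + amount * (image - lowpass(image)), is the same.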