Essential biodiversity variables (EBVs) have been proposed by the Group on Earth Observations Biodiversity Observation Network (GEO BON) to identify a minimum set of essential measurements that are required for studying, monitoring and reporting biodiversity and ecosystem change. Despite the initial conceptualisation, however, the practical implementation of EBVs remains challenging. There is much discussion about the concept and implementation of EBVs: which variables are meaningful; which data are needed and available; at which spatial, temporal and topical scales can EBVs be calculated; and how sensitive are EBVs to variations in underlying data? To advance scientific progress in implementing EBVs we propose that both scientists and research infrastructure operators need to cooperate globally to serve and process the essential large datasets for calculating EBVs. We introduce GLOBIS-B (GLOBal Infrastructures for Supporting Biodiversity research), a global cooperation funded by the Horizon 2020 research and innovation framework programme of the European Commission.
The main aim of GLOBIS-B is to bring together biodiversity scientists, global research infrastructure operators and legal interoperability experts to identify the research needs and infrastructure services underpinning the concept of EBVs. The project will facilitate the multi-lateral cooperation of biodiversity research infrastructures worldwide and identify the required primary data, analysis tools, methodologies and legal and technical bottlenecks to develop an agenda for research and infrastructure development to compute EBVs. This requires development of standards, protocols and workflows that are 'self-documenting' and openly shared to allow the discovery and analysis of data across large spatial extents and different temporal resolutions. The interoperability of existing biodiversity research infrastructures will be crucial for integrating the necessary biodiversity data to calculate EBVs, and to advance our ability to assess progress towards the Aichi targets for 2020 of the Convention on Biological Diversity (CBD).
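A 'self-documenting' workflow of the kind described above can be sketched as a step runner that emits a provenance record alongside each result, so the analysis documents its own inputs and parameters. Everything in this sketch (the `run_step` helper, the toy species-richness metric, the fields of the JSON record) is a hypothetical illustration, not part of GLOBIS-B itself.

```python
import json
import hashlib
from datetime import datetime, timezone

def run_step(name, func, inputs, params):
    """Run one workflow step and emit a provenance record alongside
    the result, so the workflow documents itself."""
    result = func(inputs, **params)
    record = {
        "step": name,
        "params": params,
        # Digest of the inputs lets others verify what data was used.
        "input_digest": hashlib.sha256(json.dumps(inputs).encode()).hexdigest(),
        "run_at": datetime.now(timezone.utc).isoformat(),
    }
    return result, record

# Toy step: count species per site whose occurrence count passes a threshold.
occurrences = {"site_a": [1, 4, 2], "site_b": [3, 3]}
result, prov = run_step(
    "species_richness",
    lambda data, min_count: {s: sum(1 for c in v if c >= min_count)
                             for s, v in data.items()},
    occurrences,
    {"min_count": 2},
)
print(result)  # {'site_a': 2, 'site_b': 2}
```

Sharing such records together with the data would let EBV calculations be discovered, audited and reproduced across infrastructures.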
The unprecedented growth, availability and accessibility of imaging data from people with neurodegenerative conditions has led to the development of computational infrastructures, which offer scientists access to large image databases and e-Science services such as sophisticated image analysis algorithm pipelines and powerful computational resources, as well as three-dimensional visualization and statistical tools. Scientific e-infrastructures have been and are being developed in Europe and North America that offer a suite of services for computational neuroscientists. The convergence of these initiatives represents a worldwide infrastructure that will constitute a global virtual imaging laboratory. This will provide computational neuroscientists with a virtual space that is accessible through an ordinary web browser, where image data sets and related clinical variables, algorithm pipelines, computational resources, and statistical and visualization tools will be transparently accessible to users irrespective of their physical location. Such an experimental environment will be instrumental to the success of ambitious scientific initiatives with high societal impact, such as the prevention of Alzheimer disease. In this article, we provide an overview of the currently available e-infrastructures and consider how computational neuroscience in neurodegenerative disease might evolve in the future.
Background and Purpose: The measurement of cortical shrinkage is a candidate marker of disease progression in Alzheimer's disease. This study evaluated the performance of two pipelines: Civet-CLASP (v1.1.9) and Freesurfer (v5.3.0). Methods: Images from 185 ADNI1 cases (69 elderly controls (CTR), 37 stable MCI (sMCI), 27 progressive MCI (pMCI), and 52 Alzheimer's disease (AD) patients) scanned at baseline, month 12, and month 24 were processed using the two pipelines and two interconnected e-infrastructures: neuGRID (https://neugrid4you.eu) and VIP (http://vip.creatis.insa-lyon.fr). The vertex-by-vertex cross-algorithm comparison was made possible by applying the 3D gradient vector flow (GVF) and closest point search (CPS) techniques. Results: The cortical thickness measured with Freesurfer was systematically about one third lower than Civet's. Cross-sectionally, Freesurfer's effect size was significantly different in the posterior division of the temporal fusiform cortex. Both pipelines were weakly or mildly correlated with the Mini Mental State Examination (MMSE) score and with hippocampal volumetry. Civet differed significantly from Freesurfer in large frontal, parietal, temporal and occipital regions (p<0.05). In a discriminant analysis with cortical ROIs having an effect size larger than 0.8, the two pipelines showed no significant difference in area under the curve (AUC). Longitudinally, effect sizes were not significantly different in any of the 28 ROIs tested. Both pipelines weakly correlated with MMSE decline, with no significant differences between them. Freesurfer mildly correlated with the hippocampal thinning rate and differed from Civet in the supramarginal gyrus, temporal gyrus, and lateral occipital cortex (p<0.05).
In a discriminant analysis with ROIs having an effect size larger than 0.6, the two pipelines yielded no significant difference in the AUC. Conclusions: Civet appears slightly more sensitive to the typical AD atrophic pattern at the MCI stage, but both pipelines can accurately characterize the topography of cortical thinning at the dementia stage.
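The closest point search (CPS) step used for the vertex-by-vertex comparison amounts to a nearest-neighbour match between the two pipelines' cortical surface meshes. The sketch below is a minimal brute-force illustration of that idea on toy coordinates; it is not the pipeline's actual implementation, and the array names are hypothetical.

```python
import numpy as np

def closest_point_search(source_vertices, target_vertices):
    """For each source vertex, return the index of the nearest target
    vertex and the distance to it (brute-force O(n*m) search)."""
    # source_vertices: (n, 3); target_vertices: (m, 3)
    diffs = source_vertices[:, None, :] - target_vertices[None, :, :]
    dists = np.linalg.norm(diffs, axis=2)      # (n, m) pairwise distances
    idx = dists.argmin(axis=1)                 # nearest target per source vertex
    return idx, dists[np.arange(len(idx)), idx]

# Toy example: map 2 "Civet" vertices onto 3 "Freesurfer" vertices.
src = np.array([[0.0, 0.0, 0.0], [1.0, 1.0, 1.0]])
tgt = np.array([[0.1, 0.0, 0.0], [0.9, 1.0, 1.0], [5.0, 5.0, 5.0]])
idx, d = closest_point_search(src, tgt)
print(idx)  # [0 1]
```

With vertices matched this way, thickness values from the two pipelines can be compared at corresponding surface locations; production meshes would use a spatial index (e.g. a k-d tree) rather than the brute-force search shown here.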
Evolution of brain imaging in neurodegenerative diseases
Brain imaging was regarded as an elective examination in patients with cognitive decline 15 years ago [1]. The practice parameters for diagnosis and evaluation of dementia defined by the American Academy of Neurology regarded computed tomography (CT) and magnetic resonance (MR) as 'optional' assessments [2,3]. Over time, imaging in dementia has moved from a negative, exclusionary role to one that added positive diagnostic and prognostic information. In the late 1990s, the traditional exclusionary approach was abandoned in favor of the inclusive approach [4,5]. Rapid advances in neuroimaging technologies such as PET, single photon emission CT, MR spectroscopy, diffusion tensor imaging and functional MRI have offered new insights into the pathophysiology of Alzheimer's disease (AD) [6] and, consequently, increasingly powerful data-analysis methods have been developed [7]. Since the beginning of the 21st century, innovative techniques for region-of-interest-based volumetry, automated voxel-based morphometry, cortical thickness measurement, basal forebrain volumetry and multivariate statistics have emerged [7][8][9], and the most feasible and accurate of these measurements have begun to be used in clinical settings. The availability to the neuroimaging community of large prospective image data repositories has led to the development of web-based interfaces to access data and online image analysis tools to assess longitudinal brain changes [10][11][12][13]. With the development of novel analysis techniques, the computational complexity of neuroimaging analysis has also increased significantly. Higher spatial resolution images and longer scans are being acquired, so more voxels need to be processed for each acquisition.
The same applies to the computational resources required by algorithms, since these have become increasingly central processing unit (CPU) intensive.
Neuroscience is increasingly making use of statistical and mathematical tools to extract information from images of biological tissues. Computational neuroimaging tools require substantial computational resources, and the increasing availability of large image datasets will further enhance this need. Many efforts have been directed towards creating brain image repositories, including the recent US Alzheimer's Disease Neuroimaging Initiative. Multisite distributed computing infrastructures have been launched with the goal of fostering shared resources and facilitating data analysis in the study of neurodegenerative diseases. Currently, some Grid- and non-Grid-based projects are aiming to establish distributed e-infrastructures, interconnecting compatible imaging datasets and supplying neuroscientists with the most advanced information and communication technology tools to study markers of Alzheimer's and other brain diseases, but they have so far failed to make a difference in the larger neuroscience community. NeuGRID is a European Commission-funded effort arising from the needs of the Alzheimer's...
Background— There is no systematic assessment of available evidence on effectiveness and comparative effectiveness of balloon dilatation and stenting for aortic coarctation. Methods and Results— We systematically searched 4 online databases to identify and select relevant studies of balloon dilatation and stenting for aortic coarctation based on a priori criteria (PROSPERO 2014:CRD42014014418). We quantitatively synthesized results for each intervention from single-arm studies and obtained pooled estimates for relative effectiveness from pairwise and network meta-analysis of comparative studies. Our primary analysis included 15 stenting (423 participants) and 12 balloon dilatation studies (361 participants), including patients ≥10 years of age. Post-treatment blood pressure gradient reduction to ≤20 and ≤10 mm Hg was achieved in 89.5% (95% confidence interval, 83.7–95.3) and 66.5% (44.1–88.9%) of patients undergoing balloon dilatation, and in 99.5% (97.5–100.0%) and 93.8% (88.5–99.1%) of patients undergoing stenting, respectively. Odds of achieving ≤20 mm Hg were lower with balloon dilatation as compared with stenting (odds ratio, 0.105 [0.010–0.886]). Thirty-day survival rates were comparable. Numerically more patients undergoing balloon dilatation experienced severe complications during admission (6.4% [2.6–10.2%]) compared with stenting (2.6% [0.5–4.7%]). This was supported by meta-analysis of head-to-head studies (odds ratio, 9.617 [2.654–34.845]) and network meta-analysis (odds ratio, 16.23, 95% credible interval: 4.27–62.77) in a secondary analysis in patients ≥1 month of age, including 57 stenting (3397 participants) and 62 balloon dilatation studies (4331 participants). Conclusions— Despite the limitations of the evidence base consisting predominantly of single-arm studies, our review indicates that stenting achieves superior immediate relief of a relevant pressure gradient compared with balloon dilatation.
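The relative-effectiveness statistic reported above, the odds ratio with a 95% confidence interval, can be computed from a 2x2 table of events and totals. The sketch below shows the standard log-odds calculation; the counts plugged in at the end are made-up illustrative numbers, not the review's actual data.

```python
import math

def odds_ratio(events_a, total_a, events_b, total_b):
    """Odds ratio of the outcome in group A vs group B, with an
    approximate 95% CI from the standard error of log(OR)."""
    a, b = events_a, total_a - events_a      # group A: events / non-events
    c, d = events_b, total_b - events_b      # group B: events / non-events
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)    # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)

# Hypothetical counts: 300/361 balloon vs 420/423 stenting patients
# reaching a post-treatment gradient <= 20 mm Hg.
result, ci = odds_ratio(300, 361, 420, 423)
print(result, ci)
```

An OR well below 1 with a CI excluding 1, as in the review's 0.105 (0.010-0.886), indicates the outcome is less likely with balloon dilatation than with stenting.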
In this work we propose a new blockchain model that ensures GDPR compliance by handling references to sensitive data and using metadata, instead of manipulating private data directly within the blockchain. We accomplish this by defining a modular architecture that relies on strong cryptographic assumptions, providing the means to guarantee that the right to be forgotten is properly enforced.
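The core idea, keeping personal data in a mutable off-chain store and recording only references and digests on the immutable ledger, can be sketched as follows. The class and method names are hypothetical illustrations of the pattern, not the paper's architecture.

```python
import hashlib
import uuid

class OffChainStore:
    """Mutable off-chain store for personal data; only references
    and hashes ever reach the chain."""
    def __init__(self):
        self._data = {}

    def put(self, record: bytes) -> str:
        ref = str(uuid.uuid4())
        self._data[ref] = record
        return ref

    def get(self, ref: str):
        return self._data.get(ref)

    def erase(self, ref: str):
        # Right to be forgotten: delete the data itself; the on-chain
        # digest remains but can no longer be dereferenced.
        self._data.pop(ref, None)

class Chain:
    """Append-only ledger holding only references and digests."""
    def __init__(self):
        self.blocks = []

    def append(self, ref: str, record: bytes):
        digest = hashlib.sha256(record).hexdigest()
        self.blocks.append({"ref": ref, "sha256": digest})

store, chain = OffChainStore(), Chain()
record = b"alice@example.com"
ref = store.put(record)
chain.append(ref, record)
store.erase(ref)                  # personal data removed off-chain
assert store.get(ref) is None     # ...while the ledger stays intact
```

Erasing the off-chain record leaves the ledger's integrity untouched: the stored digest still commits to the original data, but the data itself is gone, which is how such designs reconcile immutability with erasure requests.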
The MammoGrid project has recently delivered its first proof-of-concept prototype using a Service-Oriented Architecture (SOA)-based Grid application to enable distributed computing spanning national borders. The underlying AliEn Grid infrastructure has been selected because of its practicality and because of its emergence as a potential open source standards-based solution for managing and coordinating distributed resources. The resultant prototype is expected to harness the use of huge amounts of medical image data to perform epidemiological studies, advanced image processing, radiographic education and, ultimately, tele-diagnosis over communities of medical 'virtual organisations'. The MammoGrid prototype comprises a high-quality clinician visualization workstation used for data acquisition and inspection, a DICOM-compliant interface to a set of medical services (annotation, security, image analysis, data storage and querying services) residing on a so-called 'Grid-box', and secure access to a network of other Grid-boxes connected through Grid middleware. This paper outlines the MammoGrid approach in managing a federation of Grid-connected mammography databases in the context of the recently delivered prototype and also describes the next phase of prototyping.